NSS No. 17-T (Rev. 1) Computer Security Techniques for Nuclear Facilities

Main 1.1. Nuclear security seeks to prevent, detect and respond to criminal or intentional unauthorized acts involving or directed at nuclear and other radioactive material, associated facilities and associated activities. Nuclear security of nuclear material and nuclear facilities includes physical protection, personnel related security (e.g. trustworthiness determination, measures against insider threats) and information security.
Main 1.2. Groups or individuals planning or committing any malicious act involving nuclear material or a nuclear facility might benefit from access to sensitive information and sensitive information assets related to the material, the facility or the security measures in place.
Main 1.3. The Nuclear Security Fundamentals [1] and the three Nuclear Security Recommendations publications [2–4] all emphasize the importance of securing sensitive information. IAEA Nuclear Security Series No. 23‑G, Security of Nuclear Information [5], provides guidance on appropriate measures for the identification, classification and securing of sensitive information to achieve effective information security within the State’s nuclear security regime.
Main 1.4. Cyber‑attacks at nuclear facilities can contribute to causing physical damage to the facility and/or disabling its security or safety systems (i.e. sabotage), to obtaining unauthorized access to sensitive nuclear information, or to achieving unauthorized removal of nuclear material. Computer security is therefore vital at nuclear facilities to protect both nuclear security and nuclear safety.
Main 1.5. The protection of sensitive digital assets (SDAs) is recommended in para. 4.10 of Ref. [2], which states:
Main 1.6. General guidance on computer security for nuclear security is provided in IAEA Nuclear Security Series No. 42‑G, Computer Security for Nuclear Security [7], and more specific guidance on computer security of instrumentation and control (I&C) systems in nuclear facilities is provided in IAEA Nuclear Security Series No. 33‑T, Computer Security of Instrumentation and Control Systems at Nuclear Facilities [8]. The current publication is intended to complement this guidance by providing details of computer security techniques for other systems at nuclear facilities.
Main 1.7. The objective of this publication is to assist Member States in implementing computer security at nuclear facilities with the aim of preventing and protecting against unauthorized removal of nuclear material, sabotage of nuclear facilities and unauthorized access to sensitive nuclear information. This publication addresses computer security for supporting activities and organizations such as vendors, contractors and suppliers. While the focus of this publication is on the security of nuclear facilities, application of this guidance may also benefit facility safety and operational performance.
Main 1.8. This publication addresses the use of risk informed approaches to establish and enhance computer security policies, programmes and measures to protect SDAs and other digital assets. A nuclear facility relies on SDAs and other digital assets for the safety and security of the facility. This publication describes the integration of computer security into the management system of a facility or organization, and it includes guidance on defining policy and requirements and on activities to develop, implement, sustain, maintain, assess and continually improve the computer security measures that protect the facility from cyber‑attacks consistent with the threat assessment or design basis threat (DBT) [9].
Main 1.9. This publication also provides technical guidance on protecting other digital assets at nuclear facilities.
Main 1.10. This publication is intended for regulatory bodies and other competent authorities and for operators of nuclear facilities and their vendors, contractors and suppliers.
Main 1.11. The guidance in this publication applies to the implementation and management of computer security for nuclear security purposes at nuclear facilities. This publication is applicable to all stages in the lifetime of a nuclear facility [10].
Main 1.12. Computer security at nuclear facilities is intended to protect a range of systems that contribute to different aspects of nuclear security, such as physical protection and nuclear material accounting and control systems. This publication does not address the design or operation of such systems, except as design or operation relates to the protection of those systems by computer security measures.
Main 1.13. This publication addresses all digital assets associated with a nuclear facility, including the facility’s I&C systems. Additional guidance on specific computer security considerations for the facility’s I&C systems that provide safety, security or auxiliary functions is provided in Ref. [8].
Main 1.14. Following this introduction, Section 2 introduces key terminology, basic concepts and relationships. Section 3 describes general considerations for computer security in nuclear facilities. Sections 4 and 5 present guidance on computer security risk management (CSRM) at the facility and system levels, respectively. Section 6 presents guidance on considerations for facility and system CSRM relevant to different stages in the lifetime of the facility. Section 7 presents an overview of a computer security programme (CSP). Section 8 presents an illustrative example of the implementation of defensive computer security architecture (DCSA) and associated computer security measures.
Main 1.15. The Appendix provides specific guidance on selected elements of a CSP. Annex I provides example attack scenarios that can be used to evaluate computer security at nuclear facilities. Annex II provides an example of the assignment of computer security levels for a nuclear power plant. Annex III provides an example of the application of computer security levels and zones.
Main 2.1. This section clarifies the meaning of important terms that are used throughout this publication.
Main 2.2. The Nuclear Security Fundamentals [1] state that the targets with respect to nuclear security are the following:
Main 2.3. Reference [1] states that a nuclear security system is “An integrated set of nuclear security measures.” Nuclear security measures are defined as follows:
Main 2.4. The general guidance on computer security [7] states: “The State should develop and maintain a national computer security strategy as part of its nuclear security regime”. As nuclear facilities are within the nuclear security regime, computer security at these facilities needs to be included in that national computer security strategy. Facility functions that support safety and security need to be protected from adversaries. When these facility functions make use of, depend on or are supported by digital technologies, computer security is needed to protect these functions.
Main 2.5. Computer security is concerned with computer based systems, especially those systems that perform or support facility functions important or related to nuclear security and nuclear safety (i.e. digital assets). Computer security provides techniques and tools to defend against cyber‑attacks and against human actions or omissions that might affect security.
Facility functions, computer security levels and computer security zones 2.6. A standard way to protect systems in a structured manner, consistent with a graded approach, is to use the concepts of computer security levels and computer security zones. The computer security level assigned to a computer security zone is based on the highest degree of security protection required by any facility function performed by a system within that zone. The same computer security level is assigned to all systems within that zone. Typically, a nuclear facility zone model consists of many different zones, and several zones may have the same computer security level assigned.
Facility functions, computer security levels and computer security zones 2.7. A facility function is a coordinated set of actions and processes that need to be performed at a nuclear facility. Facility functions include functions that are important or related to nuclear security and functions that are important or related to nuclear safety (i.e. safety functions). Facility functions are assigned to systems, each of which performs one or more of these functions.
Facility functions, computer security levels and computer security zones 2.8. A computer security level is a designation that indicates the degree of security protection required for a facility function and consequently for the system that performs that function. Each computer security level is associated with a set of requirements imposed by the operator to ensure that the appropriate level of protection is provided to digital assets assigned to that level using a graded approach. Each computer security level will need different sets of computer security measures to satisfy the computer security requirements for that level.
Facility functions, computer security levels and computer security zones 2.9. A computer security zone is a logical and/or physical grouping of digital assets that are assigned to the same computer security level and that share common computer security requirements owing to inherent properties of the systems or their connections to other systems (and, if necessary, additional criteria). The use of computer security zones is intended to simplify the administration, communication and application of computer security measures.4
Facility functions, computer security levels and computer security zones 2.10. Additional criteria for defining computer security zones may include the following:
  • Organizational responsibilities, for example different computer security zones for systems that are the responsibility of different departments;

  • The need to maintain separation, for example different computer security zones for redundant systems at the same computer security level performing the same facility function;

  • Zones already defined for other purposes, for example a computer security zone defined for simplicity to be the same as a zone already established for administrative or communication purposes.

Facility functions, computer security levels and computer security zones 2.11. The idealized relationships between the concepts of facility function(s), computer security level(s), system(s) and computer security zone(s) are illustrated in Fig. 1.
Facility functions, computer security levels and computer security zones 2.12. Each of the idealized relationships is labelled in Fig. 1, and the labelled text below describes each relationship:
  • Each facility function is assigned to a single computer security level.

  • Each computer security level may be applied to one or more facility functions.

  • Each facility function is ideally assigned to one system, where possible.5

  • Each system ideally performs one facility function, where possible.6

  • Each computer security level may be applied to one or more security zones.

  • Each computer security zone is assigned a single computer security level.

  • Each system is placed within a single computer security zone, where possible.7

  • Each computer security zone may consist of one or more systems.
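
These relationships can be expressed as a small data model. The following is a minimal, hypothetical Python sketch (class and field names are illustrative and are not defined by this publication) that encodes the rules that each facility function and each zone is assigned exactly one computer security level, and that a zone's level reflects the most protection-demanding function performed within it:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FacilityFunction:
        name: str
        level: int                          # each facility function is assigned to a single level (1 = most stringent)

    @dataclass
    class System:
        name: str
        functions: List[FacilityFunction]   # ideally a single facility function, where possible

    @dataclass
    class Zone:
        name: str
        level: int                          # each computer security zone is assigned a single level
        systems: List[System] = field(default_factory=list)

        def place(self, system: System) -> None:
            """Place a system in this zone, checking that the zone level matches the
            most stringent (lowest numbered) level required by the system's functions."""
            required = min(f.level for f in system.functions)
            if required != self.level:
                raise ValueError(f"{system.name} requires level {required}, zone {self.name} is level {self.level}")
            self.systems.append(system)

    # Illustrative use only; the names are hypothetical.
    detection = FacilityFunction("intrusion detection at a critical detection point", level=1)
    z1a = Zone("Z1A", level=1)
    z1a.place(System("alarm processing unit", [detection]))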

Computer security risk management 2.13. Facility CSRM (see Section 4) addresses facility functions and determines the assignment of these functions to computer security levels and to one or more systems. Systems inherit the computer security levels of the functions assigned to them.
Computer security risk management 2.14. System CSRM (see Section 5) is part of facility CSRM and addresses systems and determines (a) the boundaries of computer security zones according to the facility functions performed and system connectivity as well as (b) the computer security measures to be applied to meet the requirements for the computer security level of the zone.
Computer security risk management 2.15. Outputs of risk management processes typically rely on scenario development, analysis and, in some instances, performance to increase confidence in the qualitative assessments. There are two categories of scenarios: functional and technical. Functional scenarios are generally used in the facility CSRM process, and technical scenarios are used in the system CSRM process.
Competing demands of simplicity, efficiency and computer security 2.16. The competing demands of simplicity, efficiency and computer security need to be balanced when considering the following:
  • Identification and listing of facility functions;

  • Assignment of facility functions to systems;

  • Development of systems;

  • Specification of requirements for computer security for different computer security levels based on a graded approach;

  • Establishment of logical and/or physical boundaries for computer security zones.

Competing demands of simplicity, efficiency and computer security 2.17. Considerations of simplicity might lead to a preference to assign a single function to a single system. This might result in a DCSA that allows for the tailoring of efficient computer security measures within each zone for each facility function (assuming a one‑to‑one relationship between systems and functions). However, the systems would need interconnections to enable integration of separated facility functions, and therefore the system of computer security levels and computer security zones might become more complex owing to the larger number of computer security zones and interconnections between these zones.
Competing demands of simplicity, efficiency and computer security 2.18. However, considerations of efficiency in the performance of facility functions by systems might lead to a preference to assign multiple functions to a single integrated system. While this might result in a smaller number of computer security zones, the complexity of the system might increase, making it difficult to apply effective computer security measures throughout these zones. Additionally, assigning to the computer security zone a computer security level appropriate for the most important function of the system might further reduce efficiency because a higher level of protection than necessary might be applied to less important functions that have been integrated into the system.
Competing demands of simplicity, efficiency and computer security 2.19. The balance between efficiency and simplicity can also include balancing the performance of facility functions through systems, with the assignment of systems to computer security zones and computer security levels. Therefore, CSRM will typically involve a number of iterations of defining computer security zones and associated computer security measures to find the optimal balance between simplicity and efficiency. Iterations will need to show that proposed modifications of computer security zone definitions will not allow a compromise of the facility functions that would lead to worse consequences.
Conceptual nuclear facility zone model 2.20. An example of a conceptual nuclear facility zone model is shown in Fig. 2, with the following characteristics:
  • The example facility is associated with severe consequences in the event of unauthorized removal of material or sabotage.

  • The number of computer security levels is limited to five, with level 1 having the most stringent demands for protection and level 5 having the least.

  • Each system is placed within a computer security zone.

  • Each zone (including its systems) is assigned a computer security level.

  • One or more zones may be assigned the same computer security level.

Conceptual nuclear facility zone model 2.21. Figure 2 illustrates a conceptual application of systems, computer security levels and computer security zones. The computer security level assigned has the following impact on the requirements for the facility functions, systems and computer security zones:
  • The higher (more stringent) computer security levels are typically required for fewer functions (and consequently applied to fewer systems) than the lower security levels. In Fig. 2, security level 1 would apply to a minimal set of critical functions each ideally assigned to a single system, whereas security level 5 would allow for a single system to have many functions assigned to it.

  • The higher (more stringent) computer security levels are generally simpler (i.e. less complex) than the lower levels. In Fig. 2, zone Z1A contains a single deterministic system whose logical and physical interactions with other zones (and systems) are minimized to the greatest extent possible, whereas zone Z5B has very little restriction on interactions with other zones (and systems).

  • The complexity of zones is typically correlated with their physical and logical size. For example, in zone Z1A, the physical locations of SDAs are likely to be restricted to a vital area, whereas in zone Z3C, any digital assets might be anywhere within the protected area. The increase in physical area from vital area (Z1A) to protected area (Z3C) potentially increases both the number of access points and the number of authorized personnel requiring physical access and therefore having the potential to interact with the digital assets.

  • The logical size of a zone may be expressed as the number of addressable, installed digital assets within a system. For example, zone Z3A might be logically scoped to have a smaller number of assignable addresses for a limited capacity of digital assets, whereas zone Z5A might have a broader scope with more available logical addresses for current and future digital assets.

  • In these examples, the number of possible addressable digital assets increases in a manner similar to the example of physical zone size in para. 2.21(c). However, the installation of additional digital assets significantly affects the logical zone size but not the physical zone size.8 This means that the number of potential logical interactions increases only when additional digital assets are installed within a zone, thereby increasing the number and complexity of these interactions within the zone and its boundary.
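
To make the notion of logical zone size concrete, suppose for illustration that a zone's logical boundary coincides with an IP subnet (an assumption made only for this sketch; the zone names and subnet sizes are hypothetical). The number of assignable addresses then bounds the number of installed digital assets:

    import ipaddress

    # Hypothetical address scopes: a tightly scoped zone versus a broadly scoped one.
    zones = {
        "Z3A": ipaddress.ip_network("192.0.2.0/28"),      # limited capacity of digital assets
        "Z5A": ipaddress.ip_network("198.51.100.0/22"),   # room for current and future assets
    }

    for name, net in zones.items():
        usable = net.num_addresses - 2                    # exclude network and broadcast addresses
        print(f"{name}: {net} -> up to {usable} assignable addresses")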

Conceptual nuclear facility zone model 2.22. The rigour with which computer security zones are defined may depend on the security levels assigned to those zones. For example, for zone Z1A, both the physical and logical boundaries are strictly defined, whereas zone Z5A might only need strict definition of the logical boundary, and the physical boundary might be more loosely defined (e.g. within a data centre, cloud service or corporate office).
Conceptual nuclear facility zone model 2.23. System boundaries (logical and physical) can be useful in defining computer security zone boundaries. In practice, a zone may comprise one or more systems, each system comprising or supported by one or more digital assets to perform or support the assigned facility function.9
Conceptual nuclear facility zone model 2.24. Computer security zone boundaries generally have physical access control (e.g. locked cabinets, barriers, port blockers) and decoupling mechanisms for data flow (e.g. packet filters, firewalls, data diodes) to prevent cyber‑attacks or other forms of unauthorized access and to prevent errors propagating from one zone to another (especially from a zone with less stringent protection requirements to one with more stringent requirements).
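The decoupling of data flows at a zone boundary can be thought of as a default-deny rule set. The sketch below is illustrative only (the zone names, level assignments and permitted flows are assumptions, not requirements of this publication); an actual boundary would be enforced by devices such as firewalls or data diodes rather than by application code:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Flow:
        src_zone: str
        dst_zone: str
        purpose: str

    ZONE_LEVEL = {"Z1A": 1, "Z2A": 2, "Z3A": 3, "Z4A": 4, "Z5A": 5}   # 1 = most stringent

    # Explicitly authorized inter-zone flows; anything not listed is denied.
    ALLOWED = {
        Flow("Z1A", "Z2A", "one-way diagnostic export"),   # e.g. enforced by a data diode
        Flow("Z3A", "Z2A", "alarm annunciation"),
    }

    def permit(flow: Flow) -> bool:
        """Default deny: only explicitly authorized flows may cross the boundary."""
        return flow in ALLOWED

    def toward_higher_protection(flow: Flow) -> bool:
        """Flows from a less protected zone into a more protected zone need
        particularly strong decoupling and monitoring."""
        return ZONE_LEVEL[flow.src_zone] > ZONE_LEVEL[flow.dst_zone]

    print(permit(Flow("Z5A", "Z1A", "remote maintenance")))                     # False
    print(toward_higher_protection(Flow("Z3A", "Z2A", "alarm annunciation")))   # True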
Conceptual nuclear facility zone model 2.25. The zone model provides for a graded approach and defence in depth. A cyber‑attack originating outside the facility would need to defeat or bypass several layers of computer security measures before having the opportunity to compromise a system with computer security level 1, 2 or 3. The measures for computer security levels 4 and 5 can also contribute to the protection of the levels of higher protection.10 For example, providing early detection capabilities within zones assigned security level 4 or 5 would be advantageous in providing an opportunity to contain and mitigate the cyber‑attack before there is any impact on SDAs in levels 1, 2 or 3.
Conceptual nuclear facility zone model 2.26. In a graded approach, the strength of computer security measures put in place to protect a facility function is in direct proportion to the potential worst case consequences of a compromise of the facility function.
Conceptual nuclear facility zone model 2.27. Computer security measures are used for the following:
  • To prevent, detect, delay and respond to criminal or other intentional unauthorized acts;

  • To mitigate the consequences of such acts;

  • To recover from the consequences of such acts.

Conceptual nuclear facility zone model 2.28. Computer security measures may also be used for the following:
  • To decrease the susceptibility of digital assets to malicious acts;

  • To prevent non‑malicious acts from degrading nuclear security.

Conceptual nuclear facility zone model 2.29. Computer security measures can be assigned to one of three categories: technical control measures, physical control measures or administrative control measures (see Ref. [7]).
Conceptual nuclear facility zone model 2.30. Computer security measures might also contribute towards or be supported by other measures implemented for physical protection, personnel related security and information security. Section 8 provides an example of the application of computer security measures within a DCSA that has five levels.
Conceptual nuclear facility zone model 2.31. Computer based systems make use of, depend on or are supported by digital technologies. Computer based systems play an ever‑expanding role in the performance of important facility functions at nuclear facilities and associated operations. Increasingly, computer based systems are integrated into new designs and may be introduced into existing facilities during modernization or to increase productivity or reliability.
Conceptual nuclear facility zone model 2.32. Computer based systems are technologies that create, provide access to, compute, communicate or store digital information, or perform, provide or control services involving such information. These systems may be physical or virtual. These systems include desktops, laptops, tablets, other personal computers, smartphones, mainframes, servers, software applications, databases, removable media, digital I&C devices, programmable logic controllers, printers, network devices, and embedded components and devices. Some computer based systems are programmable, which provides the option to modify processing steps without changing the hardware. Computer based systems are susceptible to cyber‑attacks.
Conceptual nuclear facility zone model 2.33. In the context of this publication, the term ‘digital asset’ refers to a computer based system that is associated with a nuclear facility. Any digital asset that has an important role in the safety or security of a nuclear facility will be considered an SDA.
Conceptual nuclear facility zone model 2.34. Computer security is concerned with the protection of computer based systems against compromise. Computer security is a subset of information security (as defined, for example, in ISO/IEC 27000 [11]) and shares many of the same goals, methodologies and terminology.
Conceptual nuclear facility zone model 2.35. The relationship between information security, sensitive information, sensitive information assets, digital assets and SDAs is shown in Fig. 3.
Conceptual nuclear facility zone model 2.36. A cyber‑attack is a malicious act with the intention of stealing, altering, preventing access to or destroying a specified target through unauthorized access to (or actions within) a susceptible system [8]. A cyber‑attack can be carried out by individuals or organizations and might target sensitive information or sensitive information assets. Cyber‑attacks have the following special characteristics:
  • They can be hidden.

  • Their execution can be delayed, condition based or remotely initiated.

  • Personnel (e.g. engineers, guards, operations and maintenance staff, contractors) can be deceived into unwittingly supporting the attack.

Conceptual nuclear facility zone model 2.37. Compromise of digital assets might provide pathways for, facilitate or assist in cyber‑attacks targeting SDAs, with a corresponding adverse impact on nuclear security and nuclear safety. Therefore, it is necessary to provide appropriate protection — based on a graded approach and defence in depth — to all digital assets associated with the facility to prevent their use in the compromise of SDAs. The compromise of an SDA degrades nuclear security and might result in a nuclear security event with consequences ranging as follows (from best to worst case):
  • No consequence;

  • Negligible consequences;

  • Limited consequences (including safety consequences such as an anticipated operational occurrence, and operational effects such as plant performance);

  • Moderate consequences (e.g. degraded capabilities to prevent, detect and respond to nuclear security events);

  • High consequences (e.g. unauthorized disclosure or loss of sensitive information);

  • Severe consequences (e.g. unacceptable radiological consequences due to sabotage, unauthorized removal of nuclear or other radioactive material).
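
In a graded approach, this consequence scale drives the stringency of protection (see para. 2.26). The sketch below pairs the scale with a purely illustrative mapping to computer security levels; the publication does not prescribe such a mapping, and Annex II gives a worked example of level assignment for a nuclear power plant:

    from enum import IntEnum

    class Consequence(IntEnum):
        """Consequence scale from para. 2.37 (higher value = worse)."""
        NONE = 0
        NEGLIGIBLE = 1
        LIMITED = 2
        MODERATE = 3
        HIGH = 4
        SEVERE = 5

    # Hypothetical mapping: the worse the worst-case consequence of compromise,
    # the more stringent (lower numbered) the computer security level assigned.
    LEVEL_FOR_CONSEQUENCE = {
        Consequence.SEVERE: 1,
        Consequence.HIGH: 2,
        Consequence.MODERATE: 3,
        Consequence.LIMITED: 4,
        Consequence.NEGLIGIBLE: 5,
        Consequence.NONE: 5,
    }

    def assign_level(worst_case: Consequence) -> int:
        return LEVEL_FOR_CONSEQUENCE[worst_case]

    print(assign_level(Consequence.SEVERE))    # 1 (most stringent protection)
    print(assign_level(Consequence.LIMITED))   # 4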

Conceptual nuclear facility zone model 2.38. The capabilities of potential adversaries might include the effective use of cyber‑attacks. Therefore, SDAs are targets both for their effect on facility functions and as a means for adversaries to facilitate and achieve their goals, and might be specifically targeted.
Conceptual nuclear facility zone model 2.39. A safety function is “A specific purpose that must be accomplished for safety” [12]. Safety functions are necessary “for a facility or activity to prevent or to mitigate radiological consequences of normal operation, anticipated operational occurrences and accident conditions” [12].
Conceptual nuclear facility zone model 2.40. For example, the fundamental safety functions that are required for all plant states (Requirement 4 of IAEA Safety Standards Series No. SSR‑2/1 (Rev. 1), Safety of Nuclear Power Plants: Design [13]) are as follows:
  • Control of reactivity;

  • Removal of heat from the reactor and from the fuel store;

  • Confinement of radioactive material, shielding against radiation and control of planned radioactive releases, as well as limitation of accidental radioactive releases.

Conceptual nuclear facility zone model 2.41. Paragraph 3.46 of Ref. [2] identifies physical protection functions as detection, delay and response. Physical protection functions use defence in depth and apply a graded approach to provide appropriate effective protection.
Conceptual nuclear facility zone model 2.42. Physical protection functions and safety functions are not necessarily inherently related to each other, making it difficult to treat safety functions and physical protection functions coherently in risk assessment methodologies. Therefore, describing and designating facility functions important or related to security in a manner similar to facility functions important or related to safety (i.e. safety functions) will simplify the determination of the significance of facility functions and will enable the equal treatment of safety functions and security functions of equivalent significance. Some examples of facility functions important to security are the following:
  • Intrusion detection (including assessment) at a critical detection point;

  • Control of access of persons and equipment to Category I material or vital areas;

  • Communications to coordinate response forces during a nuclear security event.

Conceptual nuclear facility zone model 3.1. Reference [7] states:
Conceptual nuclear facility zone model 3.2. The operator should identify and list the facility functions for the entire facility in a consistent manner to ensure that the identified set of facility functions can be assessed holistically. The operator should provide the list of identified facility functions to the competent authority consistent with national regulations. The computer security requirements for these facility functions should be considered, whatever the means of performing the functions (e.g. the specific technology employed, whether analogue or digital).
Conceptual nuclear facility zone model 3.3. The performance of facility functions will rely on or be supported by related sensitive information, sensitive information assets and other associated digital assets.
Conceptual nuclear facility zone model 3.4. The operator should apply computer security measures to ensure the appropriate protection (including traceability) of sensitive information, sensitive information assets and SDAs. Computer security is provided by measures to ensure confidentiality, integrity and availability as well as to meet any other requirements specified by the competent authority.
Conceptual nuclear facility zone model 3.5. The operator should identify sensitive information, taking into account the effects of its compromise and the State’s requirements for the security of sensitive information. Reference [5] provides detailed guidance on the development of a State’s requirements for sensitive information.
Conceptual nuclear facility zone model 3.6. Sensitive information may be identified directly by considering the potential consequences associated with its unauthorized disclosure (as indicated in Ref. [5]), for example information on security arrangements, which an adversary might use in planning a malicious act. For this type of information, confidentiality is typically the attribute that most needs protection. Sensitive information may also be identified less directly by considering its functional significance (i.e. its importance to the provision or performance of a facility function), for example accurate and timely data on boiler pressure, which an adversary might be more likely to exploit by modifying or destroying. For this type of information, the integrity and availability of the information might be at least as important as confidentiality.
Conceptual nuclear facility zone model 3.7. The information in the site security plan may be classified as sensitive information and measures may be implemented to protect its confidentiality for an extended period of time, since the information will remain sensitive throughout the period for which the site security plan is valid.
Conceptual nuclear facility zone model 3.8. For an I&C system and its process data, an operator might give priority to those measures that ensure system availability and integrity over those that ensure confidentiality. In this case, the process data are important to the correct performance and availability of the function and are only sensitive during the very limited intervals when the I&C system is performing a control action based on the data. However, once the process data are no longer important to the performance and availability of the function (i.e. can no longer form the basis of a control action), the historical process data have value only based on their sensitivity. Therefore, the security benefit arising from the increased assurance of confidentiality (to protect information sensitivity) needs to be balanced against that from protecting integrity and availability.
Conceptual nuclear facility zone model 3.9. While protecting the confidentiality of process data from these systems might not need stringent measures, the loss of confidentiality of other data related to the systems, such as administration passwords, source code and other key details, would provide the adversary with a significant benefit in the planning and execution of cyber‑attacks targeting the system and might lead to a need for stronger measures. Additionally, classification of the historical process data (e.g. logs) to limit their distribution (e.g. application of administrative control) might be necessary to reduce the risk of unauthorized disclosure to an acceptable level.
Conceptual nuclear facility zone model 3.10. Computer security should be implemented using a risk informed approach. Figure 4 of Ref. [7] provides an overview of a risk informed approach to computer security measures.
Conceptual nuclear facility zone model 3.11. Risk, in the computer security context, is the risk associated with an adversary exploiting the vulnerabilities of a digital asset or group of digital assets to commit or facilitate a malicious act. This risk is expressed as a combination of the likelihood of a successful attack and the severity of its consequences if it occurs.
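As a minimal illustration of this combination, the sketch below uses the relative metrics mentioned in para. 3.14 (high, medium, low). The combination rule is an assumption made for illustration; an operator's or competent authority's methodology may combine likelihood and consequence differently:

    # Qualitative risk as a combination of the likelihood of a successful attack
    # and the severity of its consequences (para. 3.11). Illustrative rule only.
    RISK_MATRIX = {
        ("low", "low"): "low",        ("low", "medium"): "low",         ("low", "high"): "medium",
        ("medium", "low"): "low",     ("medium", "medium"): "medium",   ("medium", "high"): "high",
        ("high", "low"): "medium",    ("high", "medium"): "high",       ("high", "high"): "high",
    }

    def risk(likelihood: str, severity: str) -> str:
        return RISK_MATRIX[(likelihood, severity)]

    print(risk("medium", "high"))   # high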
Conceptual nuclear facility zone model 3.12. The operator should establish and implement a CSRM process (unless the management process is performed by the competent authority). The competent authority may specify policy requirements to be followed and may require that a specific risk assessment methodology be used, or it may agree to the use of an operator’s methodology [7]. The assessment process for a facility may follow the example of the organizational computer security risk assessment as described in paras 7.10–7.16 of Ref. [7].
Conceptual nuclear facility zone model 3.13. The CSRM process should include a cyclical process for continual improvement in the management of risks associated with cyber‑attacks on the facility.
Conceptual nuclear facility zone model 3.14. Periodic and iterative risk assessments are used to support decision making within a risk management process. Computer security risk assessments are typically qualitative, involving relative metrics (e.g. high, medium, low), but could be quantitative if sufficiently reliable data were available.17 The results of risk assessments will assist in determining appropriate computer security requirements.
Conceptual nuclear facility zone model 3.15. The operator should perform CSRM for the facility to comply with regulatory requirements. Reference [7] indicates that this may include two complementary assessments, one at the organizational level and one at the system level, and such an approach should be adopted for complex, high hazard facilities, such as nuclear facilities. In the guidance in this publication, it is therefore assumed that CSRM for a nuclear facility (facility CSRM) includes a specific phase of risk assessment and management at the system level (system CSRM) (see Fig. 4). This implies two stages:
  • Assess and manage aggregated computer security risks to facility functions for the whole facility. This will ensure that the operator performs a complete assessment of the facility and will provide the competent authority with the primary means to assess the overall effectiveness of CSRM at the facility. Section 4 provides guidance on performing facility CSRM.

  • Assess and manage risks associated with each system that performs or supports those facility functions. This will ensure that the operator performs a detailed assessment of each system that performs or supports a facility function. The competent authority may require the detailed assessments as a means to review the effectiveness of specific instances of CSRM at the facility. Section 5 provides guidance on performing system CSRM.

Conceptual nuclear facility zone model 3.16. The operator should ensure independence between the teams responsible for performing overall CSRM to set the computer security requirements for the facility, those implementing the requirements and those validating that the requirements have been met.
Conceptual nuclear facility zone model 3.17. Risk management is relevant at all stages in the facility’s lifetime and throughout the life cycles of systems to inform the development, implementation and maintenance of computer security measures. Section 6 identifies risk management activities throughout the lifetime of a facility.
Conceptual nuclear facility zone model 3.18. A review of the risk assessment should be performed, and the risk assessment updated as necessary, in the following instances:
  • New information or important findings emerge that could invalidate assumptions stated in the current computer security policy, CSP, DCSA or site specific threat assessment.

  • A vulnerability is discovered that invalidates computer security measures or assumptions made in a system risk assessment.

  • A computer security incident occurs at the facility.

  • The national threat statement or DBT is modified (and the modifications are relevant to adversaries using cyber‑attacks or blended attacks). This might reflect new threats or enhanced adversary capabilities or resources that might increase the likelihood of successful cyber‑attacks.

  • There is a change to a facility function, system, SDA or computer security measure. This should include the introduction of any new equipment, software or procedures or any major change in skill sets of operating personnel. The level of effort to update the risk assessment can be informed by the assigned level of protection of the SDA (e.g. computer security level).

  • Regulatory requirements change.

  • A periodic review is due according to the continual improvement process to ensure that the assessment remains valid.

Conceptual nuclear facility zone model 3.19. Regulatory activities related to facility security, such as licensing, inspection and enforcement, should include appropriate consideration of computer security. Records from the risk management process and the resulting decisions and actions should be available for review by the competent authority on request to allow it to assess whether regulatory requirements are met.
Conceptual nuclear facility zone model 3.20. The overall structure and approach for the risk management process should include the following:
  • Facility CSRM:

    1. Definition of the scope of CSRM;

    2. Characterization of the facility;

    3. Characterization of the threats;

    4. Specification of requirements;

    5. Verification and validation;

    6. Acceptance by the competent authority.

  • System CSRM:

    1. Definition of system boundaries;

    2. Identification of digital assets (including SDAs);

    3. System computer security requirements;

    4. Verification.
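
The two-stage structure above can be summarized as ordered phase lists, with a system level process nested inside the facility level process (para. 3.15). The sketch below is bookkeeping only, not an implementation of the assessments; the placement of the system loop follows the phase ordering described in para. 4.10, and the system names are hypothetical:

    FACILITY_CSRM_PHASES = [
        "Definition of the scope of CSRM",
        "Characterization of the facility",
        "Characterization of the threats",
        "Specification of requirements",
        "Verification and validation",
        "Acceptance by the competent authority",
    ]

    SYSTEM_CSRM_PHASES = [
        "Definition of system boundaries",
        "Identification of digital assets (including SDAs)",
        "System computer security requirements",
        "Verification",
    ]

    def run_facility_csrm(systems):
        """Facility CSRM encloses a system CSRM process for each system that
        performs or supports a facility function."""
        for phase in FACILITY_CSRM_PHASES:
            print(f"facility: {phase}")
            if phase == "Specification of requirements":
                for system in systems:
                    for system_phase in SYSTEM_CSRM_PHASES:
                        print(f"  {system}: {system_phase}")

    run_facility_csrm(["central alarm station server", "plant data historian"])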

Conceptual nuclear facility zone model 3.21. Many methods exist for conducting risk assessment (see, for example, ISO/IEC 27005 [14]). Organizations need to choose a method and customize it to their specific organizational environment and objectives, while observing the need for separate facility and system level risk management.
Conceptual nuclear facility zone model 3.22. Computer security requirements and the design and implementation of measures to meet these requirements should be based on a graded approach, where computer security measures are applied in direct proportion to the potential consequences arising from compromise of the facility function. As indicated in Section 2, one practical way of applying a graded approach is to assign facility functions to computer security levels, where each computer security level is characterized by graded computer security requirements, and preventive and protective security measures can be selected to meet the requirements for the relevant level. Figure 5 illustrates the graded approach using computer security levels.
Conceptual nuclear facility zone model 3.23. While the requirements (e.g. explicit restrictions on communication between SDAs assigned to different levels) are fixed by the computer security levels, security measures (e.g. the specific type of firewall used to restrict such communications) can be chosen to protect digital assets (including SDAs) according to the architectural environment of the computer security level and the technology of the specific digital assets (including SDAs).
Conceptual nuclear facility zone model 3.24. In the computer security level approach, computer security requirements need to be defined for each level with the following considerations:

  (a) Generic requirements should be applied broadly throughout the facility and operating organization and may be applied to all digital assets. Generic requirements provide for improved nuclear security culture through a greater awareness of computer security. They also improve the computer security resilience and might provide additional defence in depth. Generic requirements cannot be credited with providing benefit to a specific computer security level or system because generic measures typically apply to a wide range of digital assets and cannot be relied on to be operated consistently and effectively.

  (b) Computer security levels are assigned, ranging from level 5 (least protection needed) to level 1 (most protection needed) (see Fig. 5). In this approach, systems containing SDAs would be in computer security levels 1–3, whereas systems in levels 4 and 5 contain other digital assets.

  (c) Computer security requirements are specified and applied according to the computer security levels assigned, in accordance with a graded approach. Computer security requirements should be based on defence in depth, whereby digital assets assigned to security levels affording higher protection do not rely solely on or implicitly trust digital assets or computer security measures of security levels with lower protection.

  (d) The computer security measures applied to meet the requirements for each computer security level should take into account the independence and diversity of the measures in order to reduce common vulnerabilities that could allow multiple layers of defence in depth to be bypassed or defeated. However, it might be necessary for some computer security measures applied in one computer security level to be repeated in other computer security levels.

  (e) With the application of a layered approach and defence in depth, computer security measures on lower levels can help protect the higher levels, especially with regard to early detection of cyber‑attack.

  (f) Computer based systems that are outside the control of the CSP are unassigned and should not be trusted by any digital asset at any computer security level.
Conceptual nuclear facility zone model 3.25. Section 8 provides guidance on computer security requirements for a graded approach using the example of five computer security levels plus generic computer security requirements.
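As a purely illustrative sketch of such graded requirements, the example below defines generic requirements plus one requirement set per level and a simple trust rule reflecting defence in depth. The requirement texts are examples in the spirit of paras 3.22–3.24; they are not the requirement sets described in Section 8:

    GENERIC_REQUIREMENTS = {"computer security awareness training"}

    # Hypothetical requirement sets, increasingly stringent from level 5 to level 1.
    LEVEL_REQUIREMENTS = {
        5: {"inventory of digital assets"},
        4: {"inventory of digital assets", "network monitoring"},
        3: {"inventory of digital assets", "network monitoring",
            "restricted communication with lower levels"},
        2: {"inventory of digital assets", "network monitoring",
            "restricted communication with lower levels", "strict change control"},
        1: {"inventory of digital assets", "network monitoring",
            "one-way outbound communication only", "strict change control",
            "physical access limited to a vital area"},
    }

    def requirements_for(level: int) -> set:
        """Graded approach: generic requirements apply everywhere; each level adds its own set."""
        return GENERIC_REQUIREMENTS | LEVEL_REQUIREMENTS[level]

    def may_trust(asset_level: int, other_level: int) -> bool:
        """Defence in depth: a digital asset must not implicitly trust assets or
        measures at a level affording lower protection (higher level number)."""
        return other_level <= asset_level

    print(sorted(requirements_for(3)))
    print(may_trust(asset_level=2, other_level=4))   # False: level 4 affords less protection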
Conceptual nuclear facility zone model 4.1. Facility CSRM is a complex process that should be performed by a multidisciplinary team of people who have skills and competencies in nuclear security, nuclear safety, operations, maintenance, computer security and engineering.18 This team might have a composition similar to that proposed for physical protection evaluations (see Ref. [15]).
Conceptual nuclear facility zone model 4.2. Facility CSRM is an iterative process that is conducted in phases. It might be necessary to review and modify assumptions, determinations or results from a previous phase on the basis of the results of a subsequent phase. Verification activities are expected to be performed between phases.
Conceptual nuclear facility zone model 4.3. The objective of facility CSRM is to assess and manage risks associated with cyber‑attacks that have the potential to degrade the nuclear security or nuclear safety of the facility.
Conceptual nuclear facility zone model 4.4. Facility CSRM should ensure that the regulatory requirements regarding computer security are met.
Conceptual nuclear facility zone model 4.5. Facility CSRM should take account of an assessment of identified adversaries who might attack the facility and their goals (e.g. sabotage, unauthorized removal of nuclear material or radioactive material, unauthorized access to sensitive information), including an evaluation of the attractiveness of targets in the facility to these adversaries. The State’s assessment of threats might be provided by the national threat statement or DBT.
Conceptual nuclear facility zone model 4.6. Facility CSRM should include a determination of the significance of each facility function in accordance with that function’s importance to the operator’s objectives. These determinations may allow for the development of a hierarchical list of potential nuclear security events (from most severe to no consequence) resulting from compromise of a facility function. Figure 7 of Ref. [7] may be used in the development of such a hierarchical list.
Conceptual nuclear facility zone model 4.7. Facility CSRM should include consideration of facility functions but not their technical implementation in systems and digital assets, which are considered in system CSRM (see Section 5).
Conceptual nuclear facility zone model 4.8. The use of a consistent approach to facility CSRM across all facilities within a State may assist the competent authorities in providing effective oversight with respect to the application of computer security at nuclear facilities.
Inputs to facility computer security risk management 4.9. The operator should use the following as inputs to facility CSRM:
  • The national threat statement or DBT, and associated analysis if available.

  • Applicable regulatory requirements and other documents. These may include the State’s information classification requirements.

  • The safety analysis for the computer systems of the facility. This safety analysis may be used in defining computer security requirements, but it is not sufficient for this purpose as it does not address all mal‑operations, notably those caused by malicious acts.

  • The site security plan [15]. The site security plan may be used in identifying the facility functions important or related to security and their significance in meeting the operator’s objectives. The site security plan may incorporate the facility’s CSP or aspects of it.

  • The facility computer security policy.

  • Current and previous facility CSP documents, including details of the assignment of facility functions to systems and the facility specific threat assessment.

Phases of facility computer security risk management 4.10. The following are the phases of facility CSRM:
  • Scope definition: Defining the scope of the risk assessment while taking account of the operator’s objectives for the facility (e.g. safety, security, operations, emergency preparedness), the physical and logical boundaries, and the stage of the lifetime of the facility. The prerequisites for and inputs to the assessment should be identified in this phase.

  • Facility characterization: Identifying facility functions and their interactions and interdependencies, identifying sensitive information that could be useful in planning an attack against the facility, and identifying targets on the basis of the identified facility functions and sensitive information.

  • Threat characterization: Analysing the national threat statement or DBT and any other relevant information or analysis of threats to identify specific tactics, techniques and procedures, along with adversaries’ skills, that could be used in cyber‑attacks (including blended attacks) on targets at the nuclear facility. The threat characterization is a model, developed through analysis of applicable aspects of the threat information, to generate a representation of adversaries who pose the greatest risk. This threat characterization phase bounds the range of credible attack scenarios.

  • Specification of computer security requirements: Generating facility level computer security requirements. The specification phase includes the following:

    1. Developing and documenting a CSP;

    2. Recommending any necessary amendments to the computer security policy;

    3. Assigning the identified facility functions to computer security levels;

    4. Creating or amending requirements for the DCSA.

      This phase may include applying analysis techniques (e.g. vulnerability assessment, threat assessment) and evaluation methods (see para. 4.98) to develop requirements from the facility characterization and threat characterization phases, and from regulatory requirements.

  • System CSRM: The CSP and the DCSA are applied to each system. System CSRM is described more fully in Section 5. Changes might be needed to the CSP and the DCSA in the light of the experience of implementing the CSP and the DCSA on each system.

  • Implementation of systems and their integration into the facility: This phase is not covered further in this publication. Changes may be needed to the DCSA and the CSP in the light of the practical engineering experience of implementation and systems integration.

  • Assurance activities: These are not strictly a phase of facility CSRM but rather a set of continuous activities that are also performed in each system CSRM process. Three types of assurance activity are used:

    1. Evaluation of compliance with computer security requirements;

    2. Verification of each phase of CSRM;

    3. Validation of the computer security of the facility.

      Scenarios are a vital part of evaluation, verification and validation activities.

  • Facility CSRM outputs: These outputs comprise the (revised) CSP, the DCSA, the site specific threat assessment and the facility CSRM compliance report. Some or all of these documents will be subject to review for acceptance by the competent authority. The output from CSRM may be an input to the further development by the State of its regulatory requirements.

Phases of facility computer security risk management 4.11. The phases of facility CSRM are shown in Fig. 6, which provides an overview of the facility CSRM process. These phases are described in more detail in the remainder of this section.
Phases of facility computer security risk management 4.12. There is one facility CSRM process per facility, within which there is a separate system CSRM process for each system. For a site that contains multiple facilities or for an organization that operates multiple facilities, there may be one process for the whole site or whole organization, resulting in one or more sets of facility CSRM output. In this case, the operator may decide how many sets of output to generate but should ensure that the process is comprehensively applied to each facility.
Phases of facility computer security risk management 4.13. The operator should identify the scope of facility CSRM, which will be the physical or logical extent of the facility functions and associated systems of concern for nuclear security. Considerations in defining the scope might include the facility’s physical perimeter; the locations of approved vendors, contractors and suppliers; the operating organization’s corporate offices; off‑site data centres; and any other strategic locations. The scope of assessment might also vary depending on the stage in the lifetime of the facility or the capability and maturity of the operating organization (see paras 5.26–5.29 of Ref. [7]).
Identification of facility functions 4.14. The operator should identify all facility functions without consideration of how those functions are performed. The presence and use of digital assets throughout the facility and throughout its lifetime make it likely that digital assets will be used to perform or support the majority of key tasks and activities related to facility functions.
Identification of facility functions 4.15. The stage in the lifetime of the facility [10] should be taken into account in characterizing the facility and identifying the facility functions. Different facility functions will be relevant at different lifetime stages, and their relative importance might change.
Identification of facility functions 4.16. Facility functions are characterized by the following elements:
  • Intrinsic significance: The importance of the facility function to nuclear security and nuclear safety and the potential consequences for the facility if the function is not performed correctly.23 This is the primary characteristic.

  • Potential effects of compromise: The manners in which the facility function could fail to be performed correctly.

  • Interdependencies between functions: The significance of a facility function might arise from other functions that depend on it.

  • The timeliness and accuracy with which the facility function needs to be performed.
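
These characterization elements can be recorded in a simple profile for each facility function. The sketch below is illustrative only (the field names, significance scale and example values are assumptions, not terms defined by this publication):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FacilityFunctionProfile:
        name: str
        intrinsic_significance: str              # primary characteristic, e.g. "severe", "moderate"
        effects_of_compromise: List[str]         # ways the function could fail to be performed correctly
        depends_on: List[str] = field(default_factory=list)   # functions this function relies on
        required_timeliness_s: float = 0.0       # how quickly the function must be performed
        required_accuracy: str = ""              # how accurately the function must be performed

    profile = FacilityFunctionProfile(
        name="intrusion detection at a critical detection point",
        intrinsic_significance="severe",
        effects_of_compromise=["alarm suppressed", "false alarms", "delayed annunciation"],
        depends_on=["electrical power supply", "alarm communication"],
        required_timeliness_s=2.0,
        required_accuracy="no missed detections within the assessed area",
    )
    print(profile.name, "->", profile.intrinsic_significance)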

Intrinsic significance of facility functions 4.17. The significance of all facility functions should be compared in order to group together those that have similar significance, if possible using a common scale that includes both security and safety considerations.
Intrinsic significance of facility functions 4.18. For facility functions important or related to nuclear security, a classification scheme based on consequences for nuclear security, such as that outlined in Fig. 7 of Ref. [7], should be used to determine the significance of the function.
Intrinsic significance of facility functions 4.19. For facility functions important or related to nuclear safety, an established safety classification scheme may be used to determine the significance of the function. However, security considerations may necessitate the assignment of higher significance than indicated by a function’s safety classification.
Intrinsic significance of facility functions 4.20. The determination of the significance of facility functions should take into account that the performance of safety functions (by systems) may support security and the performance of security functions may support safety. As a result, the significance assigned to a safety function for computer security may differ from its safety class.
Intrinsic significance of facility functions 4.21. For example, a system providing a facility function of detecting radiation for the protection of personnel (a safety objective) may also provide for the detection of unauthorized removal of nuclear material (a nuclear security objective). Although the failure of the radiation protection function from a safety perspective might have limited consequences, the consequences of failure for nuclear security might be more severe. Therefore, the facility functions provided by the system in this example would be assigned a significance value on the basis of their importance to the nuclear security objectives. (Alternatively, the operator could choose to implement independent systems to separate the functions that support nuclear safety and nuclear security, and in this example the function supporting nuclear safety could be assigned lower significance.)
Potential effects of compromise of a system on facility function 4.22. In addition to considering the intrinsic significance of the facility function, the operator should consider the effects on the facility function of a compromise of the system intended to perform it. These effects are as follows (arranged from worst to best case):
  • The performance of the facility function is indeterminate. This means that the function might be altered in any manner without the initial compromise being detected.

  • The performance of the facility function changes in unexpected ways (and other actions can be performed), but these anomalies are observable to the operator.

  • The performance of the facility function fails.

  • The performance of the facility function is as expected, meaning the compromise does not adversely affect the facility function (i.e. the system is fault tolerant).
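
When comparing possible effects of compromise, the ordering above can be treated as a scale and the worst credible effect used conservatively (cf. para. 4.25). The numeric values in this minimal sketch are an assumption introduced only to express the ordering:

    from enum import IntEnum

    class EffectOfCompromise(IntEnum):
        """Effects from para. 4.22, ordered from best (lowest) to worst (highest)."""
        AS_EXPECTED = 0         # fault tolerant: function not adversely affected
        FAILS = 1               # function fails
        OBSERVABLE_CHANGE = 2   # unexpected behaviour, but anomalies are observable
        INDETERMINATE = 3       # function might be altered without detection (worst case)

    def worst_case(effects: list) -> EffectOfCompromise:
        """Conservative approach: assess against the worst credible effect identified."""
        return max(effects)

    print(worst_case([EffectOfCompromise.FAILS, EffectOfCompromise.INDETERMINATE]).name)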

Potential effects of compromise of a system on facility function 4.23. A system intended to perform a facility function might mal‑operate in different ways when compromised, and the effects of this mal‑operation depend on the circumstances and environment at the time of the compromise, the nature of the cyber‑attack causing the compromise, and the significance of the facility function. For example, a system performing a less important facility function might, through interdependencies and interactions between the functions, be used to attack a system performing a more important function.
Potential effects of compromise of a system on facility function 4.24. For each system and each type of effect of compromise (i.e. mal‑operation), there will be different consequences for the facility. These consequences should be assessed, and the significance assigned to facility functions should be based on these potential consequences. When assessing consequences, loss of confidentiality, integrity or availability of sensitive information should be considered, as well as consequences related to unauthorized removal of material or sabotage of the facility.
Potential effects of compromise of a system on facility function 4.25. The significance assigned to a facility function should take into account whether the facility function can be defined in a way that is valid for all possible conditions or modes on which the facility function might depend. If the facility function cannot be bound in this way, the list of consequences might be incomplete and additional analysis or assignment of a higher significance value (using a conservative approach) may be needed.
Interdependencies between facility functions 4.26. The determination of the significance of a facility function should also take into account the potential consequences of compromise (or mal‑operation) on other facility functions that depend on it. Examples of such functional dependencies include the following:
  • Information dependency: A facility function provides information to another facility function. Examples of mal‑operation include the following:

    1. Interruption of the automated control instructions for a facility process;

    2. Compromise of alarms provided to security officers;

    3. Display of incorrect plant monitoring information to operating personnel;

    4. Failure to provide information for emergency responders or nuclear security officers;

    5. Loss or manipulation of procedures or instructions, or of records that document the results of these procedures.

  • Engineering or physical resource dependency: A facility function provides a physical resource to another facility function. This includes resources needed to sustain the other facility function directly and resources needed to sustain those resources. Examples of mal‑operation include the following:

    1. Interruption of the provision of water or power;

    2. Unanticipated ambient environmental conditions;

    3. Failure to schedule preventive maintenance tasks;

    4. Failure of physical protection systems (e.g. access controls, intrusion detection).

  • Policy or procedural dependency: A change to one facility function necessitates a change to another facility function. For example, if policy demands that primary and secondary heat sink functions be provided when a reactor is critical, then if one of those heat sinks becomes unavailable, the reactor has to be put into a subcritical state.

  • Proximity effects: The effects on a facility function of mal‑operation or physical failure of other systems physically near to those that perform the facility function.

Interdependencies between facility functions 4.27. Analysis of interactions and interdependencies between facility functions might reveal that an important facility function has been omitted from the scope of the assessment. Dependencies might extend beyond the facility, for example the supply of water or power to the facility. Some functions provided by external organizations may need to be considered in the analysis of facility function dependencies. In this case, it may be necessary to revise the assessment scope to include those dependencies or to make changes at the facility that remove the dependencies.
Interdependencies between facility functions 4.28. Segregation of systems performing facility functions to limit the interactions and interdependencies between them might simplify the specification of computer security levels and requirements and might improve the effectiveness and efficiency of computer security measures.
Necessary timeliness and accuracy for facility function interdependencies 4.29. The determination of the significance of facility functions may also take into account the timeliness and accuracy with which one facility function needs to respond to another facility function. Timeliness can be considered in terms of requirements for the availability of sensitive information, and accuracy can be considered in terms of requirements for the integrity of such information:
  • Availability of information implies that, for example, alerts provided by one facility function are provided promptly enough to allow other facility functions, such as assessment of and response to the alert, to be performed.

  • Integrity of information implies that, for example, a facility function provides accurate data on environmental variables (e.g. temperature, pressure, frequency, level) on which other facility functions depend.

Target identification 4.30. A target is defined in Ref. [1] as follows:
Target identification 4.31. Some systems performing facility functions will be targets and should be identified from the list of facility functions produced during facility CSRM, using the definitions of vital areas [16] and sensitive information [5]. Whether such a system is considered a target does not alter the significance of the facility function, but it is an additional consideration when determining computer security requirements.
Target identification 4.32. Targets that are associated with important facility safety and security functions should be identified as SDAs through the process described in paras 3.6–3.9. These SDAs should also be analysed for the potential value of any associated sensitive information. This will ensure that the SDAs and their associated information are considered within the facility’s information security programme and CSP and are afforded the appropriate level of protection.
Documentation of facility functions 4.33. The operator should document all facility functions identified and assessed during facility CSRM.
Documentation of facility functions 4.34. Identification of all the functions within the facility depends on having complete and accurate records describing the interactions and interdependencies between functions. These records will allow for the assessment of those functions that could have a negative impact on other functions if not performed correctly.
Documentation of facility functions 4.35. The interactions and interdependencies of a facility function might be internal or external and might be permanent or temporary. For example, during the development of systems, interaction might be needed between the development and operational environments through the physical transport of new software, data or devices, but these interactions could be removed when the systems are operational.
Documentation of facility functions 4.36. The operator should consider, when analysing the consequences of an attack directed at one facility function, the possibility that it could be part of an attack affecting multiple facility functions or part of a blended attack (i.e. combined cyber‑attack and physical attack).
Documentation of facility functions 4.37. The analysis may need to include an iterative assessment of each facility function, whereby one assessment is performed to determine the function’s intrinsic significance, and another is performed to determine the significance on the basis of interactions and interdependencies with other facility functions. The higher of the two levels of significance from these assessments should be used.
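By way of illustration only, the following sketch (in Python) applies the approach of paras 4.17, 4.26 and 4.37: each facility function is assigned the higher of its intrinsic significance and the significance inherited from the functions that depend on it. The function names, the numeric significance scale and the dependency data are hypothetical and are used only to show the mechanics of the calculation.

    from functools import lru_cache

    # Hypothetical common significance scale (higher value = more significant),
    # combining safety and security considerations (para. 4.17).
    INTRINSIC = {
        "reactor_trip": 5,          # closely related to a fundamental safety function
        "radiation_monitoring": 3,  # safety role that also supports detection of
                                    # unauthorized removal (para. 4.21)
        "alarm_display": 2,
        "maintenance_scheduling": 1,
    }

    # DEPENDS_ON[x] lists the functions that x relies on; significance flows from
    # a dependent function to the functions supporting it (para. 4.26).
    DEPENDS_ON = {
        "reactor_trip": ["radiation_monitoring"],
        "alarm_display": ["radiation_monitoring"],
        "radiation_monitoring": ["maintenance_scheduling"],
        "maintenance_scheduling": [],
    }

    def dependents(function):
        """Functions that depend on the given function."""
        return [f for f, deps in DEPENDS_ON.items() if function in deps]

    @lru_cache(maxsize=None)
    def significance(function):
        """Higher of intrinsic significance and the significance inherited from
        dependent functions (para. 4.37); assumes an acyclic dependency graph."""
        inherited = [significance(f) for f in dependents(function)]
        return max([INTRINSIC[function]] + inherited)

    for f in INTRINSIC:
        print(f, significance(f))

In this sketch the radiation monitoring function, and the maintenance scheduling function that supports it, both inherit the higher significance of the reactor trip function that depends on them, consistent with taking the higher of the two significance values as described in para. 4.37.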
Documentation of facility functions 4.38. Those facility functions that have a direct relationship between the function not being performed correctly and the most severe consequences (e.g. those facility functions closely related to the three fundamental safety functions of controlling criticality, removing heat and containing material [12])24 should be assigned the greatest significance. In these cases, the assignment of significance should not take account of other parameters or factors.
Threat characterization 4.39. Threat characterization depends on two separate continuous processes, which are interrelated:
  • The State’s assessment of threats and the development and maintenance of the national threat statement or DBT using intelligence sources;

  • The facility specific threat assessment, taking account of analysis of facility specific information and information on specific adversaries.

Sources of threat information 4.40. Paragraph 3.34 of Ref. [2] states:
Sources of threat information 4.41. The operator should put in place measures to identify, retain and manage specific information25 related to potential cyber‑attacks and adversaries (e.g. phishing emails, malware samples) to allow follow‑up analysis to support threat characterization. The operator should ensure that these measures are implemented in a manner that does not adversely affect nuclear security or nuclear safety.
Sources of threat information 4.42. The operator’s threat characterization might include elements of threat assessments performed by other organizations (e.g. the operator’s own assessments, open source intelligence reports).
Sources of threat information 4.43. The relevant competent authority is encouraged to provide an analysis of the specific information captured by the operator in a timely and cooperative manner and to support the exchange of this analysis and other important information, consistent with the State’s requirements for sensitive information [5]. Periodic reporting of incidents to the relevant competent authority by the operator may be valuable, as threat analysis and characterization is a continual activity that demands up to date information.
Sources of threat information 4.44. During the development of the national threat statement or DBT, the competent authority and other relevant State authorities should have (or should have access to) expertise and knowledge regarding potential computer security incidents (e.g. cyber‑attacks) on nuclear facilities.
Sources of threat information 4.45. Reference [7] provides guidance on the assessment of cyber threats to a nuclear security regime as well as detailed descriptions of potential sources of attack and associated attack mechanisms relevant to nuclear facilities, and of methodologies used to evaluate and identify threats.
Facility specific threat characterization 4.46. The operator should develop and maintain a facility specific threat characterization to support the evaluation of computer security risk to the facility. This should include an analysis of the national threat statement or DBT to characterize the specific nuclear security threats to the facility that contribute to the computer security risk. The analysis should describe the potential objectives, capabilities, tactics and techniques of relevant threats, providing the basis for formulating or validating the effectiveness of the facility computer security policy and CSP.
Facility specific threat characterization 4.47. The operator should perform the threat characterization in the following instances:
  • The operator performs a facility computer security risk assessment. This may sometimes be a less intensive analysis to check the previous analysis and assumptions.

  • The competent authority issues a new DBT or national threat statement.

  • The operator receives information that potentially invalidates assumptions made in the current analysis.

Facility specific threat characterization 4.48. The threat characterization done by the operator should describe the knowledge, capabilities and funding, as well as the possible campaigns, targets, tactics, techniques and procedures of identified potential adversaries, and any additional attributes of particular relevance. Paragraph 5.19 of Ref. [9] provides a list of possible additional attributes for threat characterization.
Facility specific threat characterization 4.49. The threat characterization done by the operator should identify potential combinations of tactics and techniques that might be used in an attack, such as coordinated remote and local actions, use of insiders and external adversaries, or blended attacks combining cyber‑attacks and physical attacks. The threat characterization should include the possibility of sequential or parallel cyber‑attacks with cumulative consequences, involving one or several adversaries, as well as cases where there are no indications of collusion between different adversaries (non‑collaborative attacks).
Facility specific threat characterization 4.50. The threat characterization done by the operator should allow for listing and assessment of credible types of attack. This list will form the basis of the computer security requirements and specification of the DCSA.
Facility specific threat characterization 4.51. The threat characterization should indicate whether the adversary has the capabilities to carry out a particular type of attack and whether the adversary can compromise a system performing a facility function in such a way that its behaviour is indeterminate (i.e. outside its design basis).
Additional considerations for insider threats 4.52. The threat characterization should include consideration of insider threats. Specific guidance is provided in Ref. [6]. For computer security, insider threats can be categorized as follows:
  • Passive insider: An insider with the motivation to enable but not initiate malicious acts. The computer security measures to counter a passive insider might rely on preventive measures, including having a strong security culture. Typically, a passive insider will not be deterred by detection measures, because their access to information and systems is legitimate, but will seek to avoid being identified as acting maliciously.

  • Active insider: An insider with the motivation to initiate malicious acts. There are likely to be fewer active insiders than passive insiders. The computer security controls to counter an active insider need to be more comprehensive than those to counter a passive insider and should include protective measures such as separation of duties and compartmentalization of information, physical access or system privileges.

  • Unwitting insider: An insider without the motivation to commit a malicious act and who is unaware of their exploitation by an adversary. For example, in a cyber‑attack, an unwitting insider might not be aware that certain actions can provide information or authenticated access to an adversary, such as by clicking a malicious link in an email disguised as being from a trusted source.

Additional considerations for insider threats 4.53. Adversary paths and the associated timelines for insider threats differ from other threats owing to insiders’ authorized access. This access allows insiders, for example, to use a non‑continuous series of tasks performed over an extended period of time. For example, the gathering of administrative credentials (through either social engineering or compromise of systems) to defeat measures such as access controls or segregation of duties could take place over several weeks, months or years.
Computer security policy and computer security programme 4.54. The operator’s computer security policy26 specifies the objectives and high level requirements for computer security of the facility, applying a graded approach and defence in depth. These high level requirements are specified by the operator, in compliance with applicable regulatory requirements, and are applicable without exceptions. The computer security policy is an input to facility CSRM, and facility CSRM may expand on and refine the facility computer security policy.
Computer security policy and computer security programme 4.55. The operator should develop and document its CSP27 as part of facility CSRM. The CSP is a framework for implementation of the facility computer security policy that will be used throughout the lifetime of the facility. The contents of a typical CSP are described in Section 7 and include the set of specific computer security requirements of the facility, in addition to those requirements identified by a risk informed approach.
Computer security policy and computer security programme 4.56. The operator should define computer security requirements in the CSP for the following, which are described in more detail in Section 7:
  • Organizational roles and responsibilities;

  • Risk, vulnerability and compliance assessment;

  • Organizational security procedures;

  • System security design and management;

  • Asset and configuration management;

  • Personnel management.

Computer security policy and computer security programme 4.57. The operator should specify within the CSP those baseline computer security measures that are mandatory for each computer security level. These measures are likely to consist of requirements that represent organizational policies and processes and will translate into procedures.
Computer security policy and computer security programme 4.58. Requirements for the strength of computer security measures should be identified and defined for each computer security level, consistent with regulatory requirements (if applicable). Exceptions to the application of a specific measure within a computer security level are strongly discouraged, and any such exceptions should be justified and documented within facility CSRM.
Computer security policy and computer security programme 4.59. The principal outputs from the specification phase of facility CSRM are the documentation of the CSP (or revised CSP) and a compliance report for the competent authority indicating how implementation of the CSP will ensure that regulatory requirements are met. The CSP documentation may be a single document or a collection of separate documents but should include the following:
  • A statement that indicates the level of computer security protection to be provided for each computer security level. This statement could be qualitative or quantitative, but should be verifiable.

  • A requirement to perform and document periodic computer security reviews and risk assessments in each stage of the lifetime of the facility.

  • A definition of the roles and responsibilities needed to support computer security.

  • A specification for the DCSA, combining the computer security requirements derived from the operator’s application of a risk informed approach and any such requirements imposed on the facility by national law or regulations. The DCSA specifications should include the following:

    1. The requirements for applying a graded approach (e.g. the number of computer security levels);

    2. The requirements for defence in depth;

    3. Any additional requirements (e.g. for authenticity, non‑repudiation and traceability) necessary to meet the necessary level of protection for each computer security level;

    4. Requirements that will provide and maintain the capability to prevent, detect, delay, mitigate the effects of and recover from cyber‑attacks;

    5. The specific requirements for computer security measures for each computer security level to be applied to the respective computer security zones.

  • A record of the functional scenarios or other evaluation methods used in the analysis to develop requirements. It is important that other scenarios be developed independently to provide greater assurance in the requirements (i.e. increased confidence). Use of scenarios for raising confidence in the output of the specification phase is described in more detail in paras 4.116–4.122.

Computer security policy and computer security programme 4.60. The operator should provide its CSP documentation for review by the competent authority, along with the compliance report.
Assignment of systems performing facility functions to computer security levels 4.61. Facility CSRM should include or make use of a prioritized list of facility functions, arranged in order of the significance of the facility function, as the basis for the application of a graded approach to provide the highest level of assurance of protection to those functions that have the highest potential to lead to the most severe consequences.
Assignment of systems performing facility functions to computer security levels 4.62. The aim of the computer security level approach is to simplify the application of a graded approach. Computer security levels determine which set of computer security requirements are implemented to provide the appropriate level of protection to the system performing a facility function.
Assignment of systems performing facility functions to computer security levels 4.63. The operator should identify the number of computer security levels to be used, taking account of applicable regulatory requirements. For example, an operator could choose to apply a different computer security level for each facility function. However, the complexity of applying the approach increases with the number of computer security levels. Limiting the number of computer security levels allows for common approaches and methods to be applied to different systems. Therefore, the facility may choose to use a smaller number of levels. The benefit of simplicity gained by reducing the number of levels should be balanced against the cost, in resources and efficiency, of applying measures to some facility functions that are more stringent than strictly necessary.
Assignment of systems performing facility functions to computer security levels 4.64. The operator should ensure that each facility function is assigned to a single computer security level.
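As an illustration of the record keeping implied by paras 4.57 and 4.62–4.64, the following sketch maps each facility function to exactly one computer security level and each level to a set of baseline measures. The level numbers (with level 1 the most stringent), the function names and the measure names are hypothetical, and the assumption that baseline measures are cumulative across levels is made only for the example.

    # Hypothetical assignment of facility functions to computer security levels
    # (level 1 = most stringent), one level per function (para. 4.64).
    FUNCTION_LEVEL = {
        "reactor_trip": 1,
        "radiation_monitoring": 2,
        "alarm_display": 3,
        "maintenance_scheduling": 4,
    }

    # Hypothetical baseline measures for each level (para. 4.57); in this example
    # the more stringent levels include the measures of the less stringent ones.
    BASELINE_MEASURES = {
        4: {"asset inventory", "patching procedure"},
        3: {"asset inventory", "patching procedure", "network segmentation"},
        2: {"asset inventory", "patching procedure", "network segmentation",
            "two-person rule for changes"},
        1: {"asset inventory", "patching procedure", "network segmentation",
            "two-person rule for changes", "controlled one-way data flow"},
    }

    def required_measures(function):
        """Baseline measures implied by the single level assigned to a function."""
        return BASELINE_MEASURES[FUNCTION_LEVEL[function]]

    # Consistency check: every function is assigned to a level defined in the scheme.
    assert all(level in BASELINE_MEASURES for level in FUNCTION_LEVEL.values())

    print(sorted(required_measures("radiation_monitoring")))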
Assignment of systems performing facility functions to computer security levels 4.65. In some cases, facility functions important or related to security might not be sufficiently demarcated to allow them to be clearly distinguished from other functions. The inability to separate facility functions from one another increases the complexity in assigning the significance of the facility functions. Facility functions should therefore be distinct and independent from one another to the extent possible. The operator may consider modification of the facility functions with the aim of simplifying the application of the graded approach, which in turn might also be of benefit in applying defence in depth.
Assignment of systems performing facility functions to computer security levels 4.66. The operator should include the following in the CSP documentation:
  • The number of computer security levels and the requirements for their associated computer security measures;

  • The ordered list of facility functions, indicating how the functions have been assigned to computer security levels.

Defensive computer security architecture specification 4.67. The operator should design and implement a DCSA in which all systems performing facility functions are assigned to a computer security level and protected according to computer security requirements specified for that level.
Defensive computer security architecture specification 4.68. The operator should specify those baseline computer security measures that are mandatory for each computer security level within the DCSA. These baseline measures may include technical, administrative and physical control measures.
Defensive computer security architecture specification 4.69. The DCSA should be designed to eliminate or limit the possible routes for cyber‑attack (as identified in the threat characterization) that an adversary could exploit to compromise systems performing facility functions. Similar processes for reducing physical pathways available to the adversary are detailed in Ref. [16].
Defensive computer security architecture specification 4.70. Computer security boundaries28 should be established between systems performing facility functions that have different computer security levels.
Requirements in the DCSA specification to apply a graded approach 4.71. The DCSA specification should express the overall requirements (including the number of computer security levels) and should include the strength of measures for each computer security level, the strength of measures between different computer security levels and the rules for communication between zones at different computer security levels.
Requirements in the DCSA specification to apply a graded approach 4.72. The DCSA specification should ensure that facility functions with the highest significance are assigned to the most stringent computer security level. Requirements for communications between systems assigned to different facility functions should be defined. Data flow should be controlled between facility functions of different computer security levels in accordance with a risk informed approach.
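The control of data flow between zones at different computer security levels (para. 4.72) lends itself to a simple, auditable rule set. The sketch below assumes, for illustration only, that level 1 is the most stringent level, that every permitted flow is recorded explicitly as the output of a risk informed review, and that flows directed from a less stringent zone towards a more stringent zone are flagged for additional boundary controls; the zone names and rules are hypothetical.

    # Hypothetical zone-to-level assignment (level 1 = most stringent).
    ZONE_LEVEL = {"safety_zone": 1, "security_zone": 2,
                  "operations_zone": 3, "office_zone": 4}

    # Explicitly approved flows (source zone, destination zone), recorded as the
    # output of a risk informed review; anything not listed is rejected.
    APPROVED_FLOWS = {
        ("safety_zone", "operations_zone"),   # e.g. one-way plant data for display
        ("operations_zone", "office_zone"),
    }

    def flow_allowed(src, dst):
        """Return (allowed, reason) for a proposed data flow between two zones."""
        if (src, dst) not in APPROVED_FLOWS:
            return False, "flow not in the approved, risk informed flow list"
        if ZONE_LEVEL[src] > ZONE_LEVEL[dst]:
            # Flow from a less stringent zone into a more stringent zone:
            # flagged for additional boundary controls (illustrative rule).
            return True, "approved, subject to additional boundary controls"
        return True, "approved"

    print(flow_allowed("safety_zone", "operations_zone"))
    print(flow_allowed("office_zone", "safety_zone"))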
Requirements in the DCSA specification to apply a graded approach 4.73. The DCSA specification should ensure that system design complexity is reduced where possible to simplify implementation of computer security measures. Decreasing the complexity of computer security measures can increase both performance and reliability.
Requirements in the DCSA specification to apply defence in depth 4.74. The DCSA specification should require the application of defence in depth through successive layers29 of computer security measures that have to be overcome or bypassed by an adversary in order to compromise systems performing facility functions.
Requirements in the DCSA specification to apply defence in depth 4.75. The DCSA specification should require a designed mixture of technical, physical and administrative control measures to provide defence in depth.
Requirements in the DCSA specification to apply defence in depth 4.76. The DCSA specification should require a design that ensures that a compromise or failure of a single computer security measure does not result in unacceptable consequences.
Requirements in the DCSA specification to apply defence in depth 4.77. The DCSA specification should require the use of independent and diverse measures to ensure that a common vulnerability cannot allow an adversary to compromise or bypass multiple layers of defence in depth with a single tactic.
Requirements in the DCSA specification to apply defence in depth 4.78. The DCSA specification should require the application of defence in depth between layers and within each layer. Layers of defence may use a combination of measures applicable to different computer security levels and apply them to different computer security zones. For the most severe consequences (i.e. high radiological consequences due to sabotage or unauthorized removal of Category I nuclear material), computer security measures should be implemented in multiple independent layers with the aim of providing deterministic and fail‑secure30 behaviour of systems in the event of cyber‑attack.
Requirements in the DCSA specification to apply defence in depth 4.79. The DCSA specification should be supported by an analysis report identifying the computer security measures that are fail‑secure and deterministic within the application of defence in depth. The competent authority may request that this report be submitted for review.
Defence in depth between layers 4.80. The DCSA specification should require each layer of defence in depth to be protected from cyber‑attacks originating in adjacent layers. Layers and their associated computer security measures should prevent or delay advancement of attacks.
Defence in depth between layers 4.81. The DCSA specification should require that the computer security measures used in a layer be selected and operated in a diverse and independent manner from those computer security measures used in an adjacent layer in order to mitigate common cause failures of protection mechanisms used for isolation between layers. In accordance with the principle of a graded approach, these requirements should be more stringent for those layers requiring the most stringent protection (i.e. computer security levels 1 and 2).
Defence in depth within a layer 4.82. The DCSA specification should require that a combination of computer security measures be employed within each layer to minimize the potential for a single compromise to overcome or bypass multiple measures. In accordance with the principle of a graded approach, these requirements should be greatest for those layers requiring the most stringent protection (i.e. computer security levels 1 and 2, with level 1 having the highest level of protection).
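Once the DCSA is described as structured data, the expectations in paras 4.75, 4.77, 4.81 and 4.82 can be checked mechanically: adjacent layers should not rely on a common mechanism, and each layer should combine more than one type of control measure. The layer contents below are hypothetical and serve only to show the form such a review could take.

    # Each hypothetical layer lists its measures as (name, control type, mechanism).
    LAYERS = [
        [("site firewall", "technical", "vendor_A_firewall"),
         ("visitor escort", "administrative", "escort_procedure")],
        [("data diode", "technical", "one_way_gateway"),
         ("locked cabinet", "physical", "key_lock")],
        [("application allow-list", "technical", "host_hardening"),
         ("two-person rule", "administrative", "dual_authorization")],
    ]

    def mechanisms(layer):
        return {mechanism for _, _, mechanism in layer}

    def control_types(layer):
        return {ctype for _, ctype, _ in layer}

    # Check 1: adjacent layers should not share a mechanism that a single
    # vulnerability could bypass (paras 4.77 and 4.81).
    for i in range(len(LAYERS) - 1):
        shared = mechanisms(LAYERS[i]) & mechanisms(LAYERS[i + 1])
        if shared:
            print(f"layers {i} and {i + 1} share mechanisms: {sorted(shared)}")

    # Check 2: each layer should mix technical, physical and/or administrative
    # control measures rather than rely on a single type (paras 4.75 and 4.82).
    for i, layer in enumerate(LAYERS):
        if len(control_types(layer)) < 2:
            print(f"layer {i} relies on a single type of control measure")

    print("diversity checks completed")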
Trust model 4.83. The application of a graded approach and defence in depth should be consistent with an applicable trust model. Trust models that may be applied include the following (see the illustrative example following this list):
  • Personnel trustworthiness (i.e. protection against insider threats) [6];

  • Sensitive (i.e. classified) information protection (e.g. Bell–LaPadula31);

  • Integrity protection (e.g. Biba, Clark–Wilson32).
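
A simplified illustration of two of the trust models listed above is given below: the Bell–LaPadula rules protect confidentiality (no read up, no write down) and the Biba rules protect integrity (no read down, no write up). The numeric labels are hypothetical, and the sketch deliberately omits the remainder of each model (for example, discretionary access controls and the Clark–Wilson transaction rules).

    def bell_lapadula_allows(subject_level, object_level, operation):
        """Confidentiality model: no read up, no write down."""
        if operation == "read":
            return subject_level >= object_level
        if operation == "write":
            return subject_level <= object_level
        raise ValueError("operation must be 'read' or 'write'")

    def biba_allows(subject_level, object_level, operation):
        """Integrity model: no read down, no write up."""
        if operation == "read":
            return subject_level <= object_level
        if operation == "write":
            return subject_level >= object_level
        raise ValueError("operation must be 'read' or 'write'")

    # A subject cleared at level 2 reading a level 3 object is denied under
    # Bell-LaPadula (read up) but allowed under Biba (reading more trusted data).
    print(bell_lapadula_allows(2, 3, "read"), biba_allows(2, 3, "read"))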

Trust model 4.84. After specification of the computer security requirements, the implementation of those requirements proceeds as illustrated in Fig. 6 (see also Fig. 7). The implementation of requirements demands understanding of the ways in which facility functions are performed by digital assets.
Trust model 4.85. The risk management processes in facility and system CSRM have significant interactions (see Figs 6 and 7). Facility CSRM includes the assignment of one or more facility functions to individual systems and thus sets the scope for each system’s CSRM, but facility CSRM might also be affected by the outputs of system CSRM in an iterative process. For example, in physical protection systems, multiple facility functions may be assigned to a single system owing to the unavailability of systems with segregated functions. This restricts the ability to segregate the system into separate zones, thereby limiting the zone model to either a physical boundary or a logical boundary.
Trust model 4.86. For legacy facilities or systems, some structures, systems and components might not be modifiable or alterable. This might mean at the system CSRM phase that some requirements defined in facility CSRM cannot be met, and the operator might need to revise facility CSRM to determine a suitable CSP and DCSA specification that meets the security requirements.
Trust model 4.87. Facility and system CSRM should be reviewed and might need to be revised in the following instances:
  • The facility CSRM or facility safety analysis is revised.

  • The system cannot fully comply with requirements identified in the facility CSRM output.

  • System modifications are made that have the potential to affect facility CSRM.

  • Relevant security events or incidents occur.

  • New or changed threats or vulnerabilities are identified.

Trust model 4.88. The review of both the facility and system CSRM processes needs to be included in the facility change management process to ensure that they are consistent with one another and are kept up to date. These analyses also assist in setting the requirements (e.g. defining the computer security levels) for new systems or implementations.
Trust model 4.89. Trends in successive iterations of facility and system CSRM should be periodically assessed to identify the following types of adverse pattern:
  • A risk showing a clear pattern of increasing towards or beyond the unacceptable risk threshold. In this case, consideration should be given to ways to prevent the risk threshold being exceeded.

  • A risk reaching or exceeding the threshold. In this case, appropriate actions will need to be taken (e.g. reporting to the competent authority, implementing compensatory measures consistent with the urgency identified from risk trend data).

Trust model 4.90. Trends associated with individual systems should be analysed to ensure that the trend has not invalidated the facility CSRM output. For example, system surveillance assessments may be performed continually, and system performance monitoring reports may then be approved periodically. Outputs from the corresponding systems’ CSRM should be reviewed in the facility CSRM process to ensure there is no change in the overall facility risk.
Assurance activities 4.91. There are three types of assurance activity:
  • Evaluation, which provides confidence in the outputs of phases where verification is not possible (e.g. the threat characterization and computer security requirement specification phases). Owing to the nature of the information on which the computer security requirements are developed (e.g. estimations of the threat, assumptions about facility function failure modes due to compromise of systems), the operator cannot be certain that the requirements are correct. Therefore, evaluation is needed to give the operator confidence in the outputs of the computer security requirement specification phase, namely the CSP and the DCSA.

  • Verification, which provides confirmation that the results of a phase meet the objectives and requirements defined for that phase. Where possible, verification activities occur between successive phases of facility and system CSRM. This may involve a number of performance based methods or analyses to verify the outputs of each phase prior to their use as input in a subsequent phase.

  • Validation, which is the process of determining whether the computer security of the facility provides appropriate protection against the threat (as defined in the threat characterization) and complies with regulatory requirements.

Evaluation 4.92. The operator should evaluate the CSP and the DCSA to verify that their implementation will be effective in reducing the opportunity of adversaries to compromise systems performing facility functions, specifically through the following:
  • Identification and assignment of functions to computer security levels;

  • Assignment of computer security measures to those levels;

  • Specifications for computer security measures.

Evaluation 4.93. The evaluation of the CSP and the DCSA should include functional and performance testing in a manner that meets regulatory requirements. The evaluation should include consideration, as appropriate, of both facility and system CSRM and of the whole lifetime of the facility.
Evaluation 4.94. The operator should consider using independent experts to review its CSP and DCSA.
Evaluation 4.95. The operator should justify all assumptions about the likelihood of attacks or their success (e.g. vulnerability, exposure, opportunity) that are used in evaluation. The likelihood should be assumed to be 1 for postulated scenarios that can result in unacceptable radiological consequences33 or unauthorized removal of nuclear material (i.e. compromise of SDAs).
Evaluation 4.96. The national threat statement or DBT and the facility specific threat assessment provide the basis by which the operator can conduct an analysis to confirm the assumptions made during the assignment of facility functions to the appropriate computer security level. The use of credible functional scenarios (para. 4.120(a)) may allow for a greater level of assurance in the quality of the assessment (see Annex I for example scenarios).
Evaluation 4.97. Computer security measures based on the CSP and the DCSA provide detection, delay and response functions through physical (e.g. structure), technical (e.g. firewall) and administrative (e.g. personnel, procedures) control measures. The interaction of these computer security measures with the facility functions important to safety and security, and their assigned systems, makes the evaluation of the CSP’s effectiveness a challenging task.
Evaluation 4.98. A number of evaluation methods are available, including the following:
  • Attack tree analysis (also referred to as ‘attack vector analysis’ and ‘attack graph analysis’). This involves postulating a set of different possible adversary paths to determine whether there is high assurance that each attack will fail (i.e. that the adversary can be prevented from following the path) or be detected and responded to before the adversary reaches the objective. Attack tree analysis can be used, with the threat characterization, to assess whether the measures based on the CSP and the DCSA are effective in eliminating or minimizing the potential for an adversary to make the postulated attacks successfully.

  • Simulation. This includes computer based simulations of elements of the CSP (including the DCSA) and tabletop exercises that allow consideration of security and contingency plans as well as decision making by the adversary and computer security incident responders. These tools are used to judge the overall performance of the CSP, taking all measures into account. For example, tabletop exercises might assist in determining the opportunities available to an adversary on the basis of their capabilities and characteristics (e.g. whether they are insiders), or the vulnerabilities of the function.

  • Exercises. These can include both facility level and system level performance testing (e.g. penetration tests) as well as force‑on‑force exercises (e.g. for blended attacks) in either field conditions or test conditions. These exercises can address the effectiveness of the CSP in providing protection to the entire facility, parts of the facility, specific sets of systems or sets of measures against a simulated adversary attack. In this evaluation activity, data concerning the performance of computer security measures are collected and used to evaluate the overall effectiveness of the CSP.

Evaluation 4.99. Simulation and exercises are typically performed as part of scenario based analysis, in which postulated attacks (scenarios) are specified in detail and simulated or used as a basis for exercises. Scenario based analysis typically builds on attack tree analysis by considering specific adversary tactics and techniques for defeating computer security measures.
Evaluation 4.100. The effectiveness of the CSP, the DCSA or individual computer security measures can be evaluated quantitatively or qualitatively or both. The competent authority may prescribe deterministic evaluation methods to be used for different types of target, threat and scenario. It is suggested that the overall effectiveness of the CSP and the DCSA be conservatively defined as the lowest effectiveness that still meets regulatory objectives when all adversary tactics and techniques and credible scenarios have been considered.
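As a highly simplified illustration of the evaluation methods in paras 4.98–4.100, the sketch below scores each postulated adversary path by the probability that at least one computer security measure along it is effective, and reports the lowest path score as the conservative overall figure. The paths, the probabilities and the assumption of independence between measures are all hypothetical; an actual evaluation would draw on performance data, exercises and qualitative judgement, and may be constrained by methods prescribed by the competent authority.

    from math import prod

    # Hypothetical postulated adversary paths; each number is the probability that
    # the corresponding computer security measure fails to stop or detect the step.
    PATHS = {
        "remote access via office network": [0.4, 0.2, 0.1],
        "portable media introduced by an insider": [0.6, 0.3],
        "compromised vendor update": [0.5, 0.2, 0.2],
    }

    def path_effectiveness(failure_probabilities):
        """Probability that at least one measure along the path is effective,
        assuming (for simplicity) independence between the measures."""
        return 1.0 - prod(failure_probabilities)

    results = {name: path_effectiveness(p) for name, p in PATHS.items()}

    # Conservative overall figure: the weakest postulated path (para. 4.100).
    overall = min(results.values())

    for name, effectiveness in sorted(results.items(), key=lambda item: item[1]):
        print(f"{name}: {effectiveness:.2f}")
    print(f"overall (conservative) effectiveness: {overall:.2f}")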
Verification 4.101. The objective of verification in this context is to evaluate the quality of outputs from one phase against the specifications before that output is used in a subsequent phase.
Verification 4.102. Verification should, where possible, occur between successive phases of facility or system CSRM.
Verification 4.103. The results of verification might lead to the following actions by the operator:
  • Addressing any deficiencies in design or implementation of computer security measures to meet the requirements;

  • Identifying, analysing and implementing upgrades that might be necessary to address identified deficiencies and improve performance.

Verification 4.104. These verification activities might involve evaluation methods, including exercises, performance testing, simulation or analysis (e.g. vulnerability assessment) (see para. 4.98).
Verification 4.105. For example, evaluation of outputs based on attack tree analysis includes consideration of the flow of information between systems, devices, networks and locations. The exchange of information between systems can allow adversaries to exploit these pathways, potentially leading to compromise of systems and thereby of facility functions. Attack tree analysis at this stage considers generic pathways with the aim of minimizing or eliminating the possibility of an adversary gaining access to these pathways.
Verification 4.106. The operator should use a graded approach when determining the level of effort to be applied to verification and validation. The greatest level of effort should be applied to those functions or systems assigned to the most stringent computer security levels (i.e. those requiring the greatest level of protection).
Verification 4.107. Verification should be repeated on a regular basis (e.g. annually) or as needed to take into account any changes in targets or in the nuclear security programme requirements.
Validation 4.108. The operator should validate that the systems, when integrated together, have the appropriate level of protection to meet computer security requirements as expressed in the CSP and the DCSA. Figure 7 illustrates the verification and validation activities within the CSRM process, CSP and DCSA.
Validation 4.109. The operator should validate that the systems, as they are installed at the facility level, have the appropriate level of computer security protection to perform their facility functions to meet requirements as expressed in the facility security requirements.
Validation 4.110. The operator should validate that the level of computer security protection is sufficient to ensure that the operation of the facility meets regulatory requirements or operator requirements as expressed in the facility security requirements.
Validation 4.111. Where the validation indicates that the level of protection is not sufficient, the operator should revise its CSP and DCSA to increase protection. The operator may not reduce the level of protection without the agreement of the competent authority.
Validation 4.112. The operator should validate the outputs of both the facility and system CSRM processes. The facility CSRM outputs should be validated against the operator’s and regulatory requirements. The system CSRM outputs should comply with the CSP and DCSA requirements.
Validation 4.113. The operator should determine the aggregate facility risk level, including reference to applicable regulatory and design requirements. This should also include the system risk level for each individual system that contains an SDA.
Validation 4.114. The operator should validate the facility and system level risk assessments against the national threat statement or DBT using scenarios that involve attacks affecting multiple systems and the overall architecture. These scenarios differ from those used in system CSRM (para. 5.5(j)) and those specified in the national threat statement or DBT. They might include blended attacks involving compromise of a number of separate systems with the aim of identifying vulnerabilities somewhere in the facility.
Validation 4.115. Full validation of the results of both facility CSRM and system CSRM should include consideration of both technical and functional scenarios as described below.
Scenario identification and development 4.116. The operator should identify and develop scenarios based on the State’s assessment of the threats as detailed in the national threat statement or DBT and, where appropriate, the facility specific threat assessment. Operators are strongly encouraged to include experts in cyber‑attacks and related threat capabilities in the development of these scenarios. This expertise can be found in competent authorities, intelligence services and law enforcement agencies. The operator might be required to provide these detailed scenarios to the competent authority for review and acceptance.
Scenario identification and development 4.117. Analysis of scenarios might provide insight into the most vulnerable points within the facility, processes, system architectures and procedures. Further analysis might be needed to identify computer security measures already in place or those that need to be added to address the identified vulnerabilities.
Scenario identification and development 4.118. Scenarios should be used in verifying the results of the facility computer security risk assessment, including the analysis of possible adversary tactics, likelihood of attack and potential consequences.
Scenario identification and development 4.119. The scenarios should be reassessed periodically to ensure that they remain sufficient to meet security objectives in the light of changes in the threats.
Scenario identification and development 4.120. There are two categories of scenario:
  • Functional scenarios, which are scenarios based on the threat assessments and which reflect the potential effects on facility functions of the compromise of systems performing those functions. These scenarios include those involving sabotage resulting in unacceptable radiological consequences and unauthorized removal of nuclear material. Functional scenarios can also be used to identify critical dependencies between functions or systems.

  • Technical scenarios, which are scenarios based on the specific technical implementation of computer security measures and which involve detailed information about the actual or potential implementation of digital assets. These scenarios can be assessed through performance based or tabletop exercises, typically as part of the verification and validation of both the facility and the system CSRM outputs.

Scenario identification and development 4.121. These scenarios are developed and analysed between facility CSRM and system CSRM phases, and within elements of facility CSRM if needed for analysis. These scenarios are necessary to raise confidence in the outputs of the requirement specification phase but can also be used to develop these requirements. The set of scenarios used for analysis to develop the requirements cannot be identical to the set of scenarios used in assurance activities.
Scenario identification and development 4.122. Scenarios considered should include multiple attack routes (e.g. via different networks and local systems), attacks involving insiders and blended attacks. They should also include the potential for sequential cyber‑attacks that multiply the consequence but that show no indications of collusion between different adversaries (non‑collaborative attacks).
Scenario identification and development 4.123. Scenarios can include the following: (a) Stand‑alone attacks by a single adversary; (b) Coordinated attacks by a group of adversaries working together; (c) Opportunistic attacks, in which independent adversaries effectively create a combined attack. For example, a vulnerability is publicly disclosed by one adversary, which allows other adversaries to target the facility systems and equipment; (d) Specific threat capabilities [9]; (e) Blended attacks with coordinated cyber and physical elements. Attack tree analysis can help in identifying threat scenarios as well as in identifying protective strategies.
Scenario identification and development 4.124. Scenarios should be periodically reviewed and updated in the following instances:
  • When the national threat statement or DBT is updated;

  • When significant modification of the facility is undertaken;

  • When changes are made to security processes, critical countermeasures and architectures;

  • When new credible attack routes are identified;

  • When new regulatory requirements are introduced;

  • When new critical vulnerabilities34 become known, especially those involving important computer security measures;

  • When the threat characterization changes.

Scenario identification and development 4.125. For the most significant scenarios, specific attack vectors and components should be identified and their risks documented.
Scenario identification and development 4.126. The facility CSP documentation should describe the computer security measures required to maintain protection against adversaries analysed during the assessment.
Scenario identification and development 4.127. The output of facility CSRM should comprise the facility CSP documentation and a determination of the aggregate facility risk based on an evaluation of the effectiveness of those measures identified in the CSP as providing protection against adversaries described in the national threat statement or DBT.
Scenario identification and development 4.128. The facility CSRM report should include a high level review and analysis of security system design and configuration management as detailed in the CSP. A more detailed analysis should be performed during system CSRM.
Scenario identification and development 4.129. Facility functions and their corresponding systems in the facility CSRM output should be addressed in comprehensive system level risk assessments as described in Section 5.
Scenario identification and development 4.130. The operator’s assessment of risk associated with different functions and aggregate facility risk should be provided to the competent authority.
Scenario identification and development 5.1. The operator should establish a systematic and periodically reviewed process for managing the computer security risk to digital assets, including SDAs, within the systems that perform the facility functions identified in the facility CSRM process35. Compromise of SDAs typically has the potential to lead to very high, high or medium severity consequences (as described in Ref. [7]). Facility CSRM should include system CSRM for each system, as described in this section. System CSRM should consider all digital assets in the system, including SDAs.
Scenario identification and development 5.2. System CSRM should be performed by a multidisciplinary team similar to that for facility CSRM. However, the composition of the system CSRM team may be tailored to address specific considerations associated with each system.
Scenario identification and development 5.3. The operator should use a graded approach when determining the level of effort to be applied to risk management for each system. The greatest level of effort should be applied to those systems that perform or support the facility functions assigned to the most stringent computer security levels (i.e. those requiring the greatest level of protection) as determined in the facility CSRM process.
Scenario identification and development 5.4. The primary objective of the system CSRM is to evaluate and manage the computer security measures to ensure that they provide the appropriate level of protection for the specific system (i.e. that required for its computer security level) according to the requirements defined in the facility CSRM output.
Scenario identification and development 5.5. To meet this objective, system CSRM includes the following steps:
  • Assessing each facility function, the systems assigned to perform the function and the computer security level applied to those systems — taking account of other facility functions that have interactions and interdependencies identified in the facility characterization phase of facility CSRM — to define the functional boundaries of the systems.

  • Identifying the scope of each system, including those systems that support other facility functions that interact with and depend on the function performed by the system. This can include analysis of the overall system architecture to identify the locations, boundaries, interfaces and communication paths of systems containing digital assets, including SDAs.

  • Identifying (and creating an inventory of) digital assets within those systems.

  • Defining and establishing computer security zones on the basis of the requirements identified in the facility CSP and the DCSA.

  • Identifying SDAs and other digital assets within the zone boundaries by asset analysis, which is an assessment of the digital assets to determine whether they are vital to the performance of the facility function.

  • Assigning digital assets, including SDAs, to the computer security level assigned in the facility CSRM output to their facility security or safety function.

  • Applying to the whole zone the most stringent computer security level assigned to any of the functions provided by digital assets within the zone, and assigning all of the digital assets within the zone to that level (see the illustrative example following this list).

  • Applying baseline computer security measures (see paras 4.58 and 4.68) and additional computer security measures to the SDAs and other digital assets (including at the zone boundaries), taking into account the specificities of the identified systems to meet the requirements of the assigned computer security levels.

  • Providing a process to determine the technical control measures, administrative control measures or physical control measures that can be applied to meet the baseline computer security measures.

  • Analysing specific attack routes, scenarios and vulnerabilities to verify the effectiveness of the applied computer security measures.

  • If the analysis shows that a system is not sufficiently protected by the baseline computer security measures, applying additional or compensatory measures to reduce the risk to an acceptable level.

  • Developing a system CSRM report for the identified system.
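
The zone levelling step referred to above can be illustrated with a few lines of code. The asset names and level assignments are hypothetical; level 1 is taken to be the most stringent, consistent with para. 4.82.

    # Hypothetical digital assets in one computer security zone, each tagged with
    # the level assigned in facility CSRM to the facility function it supports.
    ZONE_ASSETS = {
        "plc_01": 2,                     # supports a level 2 function
        "historian": 3,
        "engineering_workstation": 3,
    }

    def zone_level(assets):
        """Most stringent level among the assets' functions (lowest number)."""
        return min(assets.values())

    level = zone_level(ZONE_ASSETS)

    # Every digital asset within the zone, including those supporting less
    # significant functions, is then assigned and protected at that level.
    protected_assets = {name: level for name in ZONE_ASSETS}

    print(level)             # -> 2
    print(protected_assets)  # every asset in the zone protected at level 2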

Scenario identification and development 5.6. This process may result in the identification of other digital assets that were not part of the systems assigned to facility functions during facility CSRM, or were identified as being outside a system or zone boundary during system CSRM. In such cases, additional analysis should be performed to ensure the inclusion of all associated digital assets in the assessment and the CSP.
Scenario identification and development 5.7. The outputs of system CSRM should include the prioritization of risks within the system to determine the appropriate implementation of computer security measures. The process should include consideration of the location of the components that make up the system, vulnerabilities, and computer security levels and zones if defined, as well as the significance of SDAs and other digital assets within the system under assessment.
Scenario identification and development 5.8. The operator should perform system CSRM in the following instances:
  • When a facility is first constructed (for every system);

  • When a facility is modified (for every system);

  • When a new system or digital asset is deployed (for every affected system);

  • When a system or digital asset is modified (for every affected system);

  • When the facility CSRM process is revised (for every system).

Scenario identification and development 5.9. The following inputs should be identified and made available for use during system CSRM:
  • Facility CSRM outputs (e.g. the CSP and DCSA specifications);

  • The safety analysis report;

  • The site security plan;

  • The computer security policy.

Overall defensive computer security architecture requirements for computer security 5.10. The operator should use the requirements for the DCSA set out during facility CSRM to design, implement and maintain computer security measures for systems and digital assets to prevent, detect, delay, mitigate and recover from cyber‑attacks.
Overall defensive computer security architecture requirements for computer security 5.11. Computer security measures should be effective throughout the lifetime of the facility, for example during periods of maintenance and decommissioning, when significant configuration changes may be made. Monitoring, maintenance and recovery activities should not provide means by which an adversary might bypass computer security measures, for example bypassing the protection on communication pathways between facility functions that have different computer security levels.
Overall defensive computer security architecture requirements for computer security 5.12. Computer security boundaries36 should be applied between computer security zones and should be protected using different computer security measures.
Overall defensive computer security architecture requirements for computer security 5.13. Data flow should be controlled between zones of different computer security levels and between zones of the same computer security level, using a risk informed approach, to ensure that the DCSA remains effective.
Definition of system boundaries 5.14. The system boundary defines the scope for each system’s CSRM and encompasses the systems identified as providing a particular facility function on the basis of the facility characterization. This should include considerations of interdependencies between facility functions and their systems.
Definition of system boundaries 5.15. System CSRM should include identifying and documenting the system boundaries. These include all the components, subcomponents, interfaces and environments of the system in question during all stages in the lifetime of the facility, as well as those other systems that provide support or auxiliary functions.
Definition of system boundaries 5.16. The following steps can be used to define the boundaries of the system under assessment:
  • Identify all the interfaces of the system.

  • Identify all the points at which data enter and leave the system (points at which an adversary might be likely to attempt to inject malicious code). Any means of injecting malicious code into the system should be considered in the system security risk assessment. For example, malicious code could be injected via communication connections, supplied products and services, or portable devices that are temporarily connected to target equipment.

  • Identify the procedures that involve interaction with the system in normal operation and in specific circumstances (e.g. patching).

  • Identify which data pathways (if any) are not used by any procedures during system operation and maintenance. Unused data pathways represent a significant vulnerability (such pathways are flagged in the sketch following this list).

  • Identify the assigned computer security level of the system (from the facility CSRM output).

  • List the computer security measures applied to the system or its environment.
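
As one way of recording the outcome of these steps, the sketch below defines a simple boundary inventory and flags data pathways not used by any documented procedure, which the preceding step identifies as a significant vulnerability. The field names and example entries are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch of a system boundary inventory used during system CSRM.
# Field names and example entries are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Pathway:
    name: str                      # e.g. a network interface or removable media port
    direction: str                 # "in", "out" or "bidirectional"
    used_by_procedures: list = field(default_factory=list)

@dataclass
class SystemBoundary:
    system: str
    security_level: int            # assigned during facility CSRM
    pathways: list = field(default_factory=list)

    def unused_pathways(self):
        """Pathways not referenced by any operating or maintenance procedure."""
        return [p for p in self.pathways if not p.used_by_procedures]

boundary = SystemBoundary(
    system="cooling-control",
    security_level=2,
    pathways=[
        Pathway("ethernet link to historian", "out", ["daily data export"]),
        Pathway("front panel USB port", "in"),   # no procedure uses this pathway
    ],
)
print([p.name for p in boundary.unused_pathways()])
```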

Definition and construction of computer security zones 5.17. The CSP and DCSA specifications produced during facility CSRM place computer security requirements on the implementation of the zone model. The CSP will also include a list of facility functions and the systems assigned to them.
Definition and construction of computer security zones 5.18. The operator should implement computer security measures to meet the requirements set out in the DCSA specification. In doing so, consideration should also be given to achieving the following [8]:
  • Systems belonging to the same zone form a trusted area for internal communications between those systems, and the computer security level applied throughout a zone that has such a trusted area is the most stringent level of those assigned to the systems involved.

  • Safety architecture requirements (e.g. redundancy, diversity, physical and electrical separation, single failure criterion) are maintained.

  • Defence in depth is applied both within each computer security zone (by using diverse, independent and overlapping administrative, physical and technical control measures) and between computer security zones.

  • Technical control measures to provide continuous or automatic preventive or protective actions (i.e. requiring no human intervention) complement physical or administrative control measures (i.e. requiring human intervention) where appropriate.

  • All connections between zones have decoupling mechanisms for data flow, operating according to zone dependent rules to prevent unauthorized access and undesired interactions between the zones. This includes continuous network connections and intermittent connections, for example using removable media.

  • The level of decoupling between zones is dependent on the computer security levels of the two zones. Decoupling measures include technical control measures, such as packet filters, firewalls and data diodes, at zone boundaries to restrict data flow and communication between different zones.

  • Permitted communications between zones at different security levels follow requirements specified for the levels involved in the CSP. The development of requirements for permitted communications may include consideration of trust models (see para. 4.83).

  • If the requirements in the CSP permit SDAs from zones assigned to different security levels to communicate, the connection is allowed to be initiated only by the SDA assigned to the higher (more stringent) computer security level. SDAs performing sensitive information management functions typically do not allow for communication from higher to lower levels (i.e. information flows in the opposite direction), in accordance with the Bell–LaPadula trust model (see para. 4.83). A minimal sketch of this initiation rule is given after this list.

  • If communication initiated by the SDA subject to the lower computer security level37 is unavoidable and in violation of its trust model, exceptionally stringent decoupling mechanisms are used.

  • Logical or physical access to digital assets in a zone by permitted mobile devices or other temporary equipment is treated as a form of intermittent connection to that zone and is subject to computer security measures for both the established zone and the temporarily connected devices. Such devices are subject to additional computer security measures if they connect to more than one zone.

  • Zones can be partitioned into subzones to improve the configuration and to prevent undesired interactions with other systems.
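
The connection initiation rule, and the exception for unavoidable trust-model violations, can be expressed as a minimal sketch. It assumes that level 1 is the most stringent level; the exception flag stands in for the documented justification and exceptionally stringent decoupling required in that case.

```python
# Minimal sketch: check whether an SDA may initiate a connection to a zone at a
# different computer security level. Assumes level 1 is the most stringent.

def may_initiate(initiator_level: int, target_level: int,
                 documented_exception: bool = False) -> bool:
    """Connections between levels should be initiated by the more stringent side.

    If the less stringent side must initiate (a trust-model violation), it is
    accepted here only when a documented exception with exceptionally stringent
    decoupling is in place.
    """
    if initiator_level == target_level:
        return True                    # same level: governed by zone rules
    if initiator_level < target_level:
        return True                    # more stringent side initiates: allowed
    return documented_exception        # otherwise only with a documented exception

assert may_initiate(1, 2) is True      # level 1 SDA may initiate towards level 2
assert may_initiate(2, 1) is False     # level 2 SDA may not initiate towards level 1
assert may_initiate(2, 1, documented_exception=True) is True
```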

Definition and construction of computer security zones 5.19. Digital assets should be considered for separation into distinct zones when any of the following conditions are met:
  • The digital assets belong to systems that perform different facility functions.

  • Systems contributing to the same facility function are assigned different computer security levels.

  • Systems contributing to the same facility function and assigned the same computer security level are managed by different organizational units.

  • Servers communicate with multiple clients (e.g. those used with distributed control systems and programmable logic controllers). The zone requiring the most stringent protection should contain the minimum possible number of unique assets.

  • Systems need to communicate with common infrastructure components used by multiple systems (e.g. directory services, time servers, security log collectors) but not with one another. Communication between zones containing these types of system and zones containing the common infrastructure components needs to be monitored and controlled.

  • The systems are administration systems (especially when the same systems are used to administer several functional systems).

  • Regulations require distinct zones.

Definition and construction of computer security zones 5.20. Digital assets may be considered for assignment to different zones, despite being assigned the same computer security level, in the following cases:
  • The digital assets are in systems performing different facility functions. In such cases, assignment of digital assets to different zones may improve the separation of the zones and systems that contribute to a facility function.

  • Different organizational units are responsible for different digital assets.

  • There are isolated digital assets, or several digital assets of the same functional system are hosted on an isolated network.

  • Separate redundant systems performing the same facility function need to be assigned to individual zones.

  • Regulations require separation of the digital assets.

Definition and construction of computer security zones 5.21. Network connections and local exchanges (e.g. via removable media or mobile devices) of data between systems in different zones should be limited to only those that are essential. Where network connections across zone borders are essential, they should be established from the zone with the higher computer security level to the zone with the lower computer security level. Restrictions can be applied using technical control measures (e.g. filtering devices) or administrative control measures (e.g. rules for the use of removable media on a specific system). Network connections and methods that are permitted for disconnected exchange of data should be documented.
Definition and construction of computer security zones 5.22. A specific zone can only include systems (and digital assets) of the same computer security level. The zone is assigned the computer security level of the systems within the zone. A given computer security level can and should apply to different zones. However, in some specific cases it might be difficult to separate systems assigned to different computer security levels into different zones. In such cases, some systems could become part of a zone assigned a more stringent computer security level than they need.
Definition and construction of computer security zones 5.23. Communications should be allowed only between zones of the same computer security level or adjacent levels. Communications between zones with different computer security levels should be limited to specific zone entry points (e.g. one entry point filtering connections between zones with computer security level 2 and zones with computer security level 3). Security measures for all entry points should be defined in an efficient and consistent manner to enforce a secure overall architecture. Specific checks should be applied at a zone entry point, for example on the content of data (e.g. acceptable ranges of parameter values) entering or leaving, or the data’s digital signature. Zone entry points should also have specific event log monitoring.
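
As an illustration of the entry point checks mentioned above, the sketch below validates parameter values against acceptable ranges and verifies message integrity. A keyed hash (HMAC) stands in for a full digital signature for brevity; the parameter ranges and the key are illustrative assumptions, and an actual implementation would follow the facility CSP.

```python
# Minimal sketch of content checks at a zone entry point: range checking of
# parameter values and integrity verification of the message. An HMAC stands in
# for a digital signature; ranges and keys are illustrative assumptions.
import hashlib
import hmac
import json

ACCEPTABLE_RANGES = {"coolant_temp_c": (10.0, 90.0), "valve_position_pct": (0.0, 100.0)}
SHARED_KEY = b"example-key-managed-per-CSP"   # placeholder only

def integrity_ok(payload: bytes, tag: str) -> bool:
    """Verify the message authentication code attached to the payload."""
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

def ranges_ok(values: dict) -> bool:
    """Reject any parameter outside its acceptable range or not on the allow-list."""
    for name, value in values.items():
        if name not in ACCEPTABLE_RANGES:
            return False
        low, high = ACCEPTABLE_RANGES[name]
        if not (low <= value <= high):
            return False
    return True

def admit(payload: bytes, tag: str) -> bool:
    """Admit a message into the zone only if both checks pass."""
    return integrity_ok(payload, tag) and ranges_ok(json.loads(payload))

message = json.dumps({"coolant_temp_c": 45.2}).encode()
tag = hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()
print(admit(message, tag))   # True
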
Identification of digital assets 5.24. The following records should be consulted when identifying a system’s digital assets:
  • System asset database (of all digital components);

  • Software and firmware inventory;

  • Lists of sensitive information relevant to the system [5];

  • System network and architecture diagrams;

  • Facility design documents such as the safety analysis report or test reports;

  • Data flow diagrams;

  • List of user and system accounts and privileges;

  • Procedures related to the identified system.

Identification of digital assets 5.25. The list of digital assets may include their identifiers, key technical specifications and data, descriptions of their interfaces, references to facility level and system level risk assessments, and their assigned owners.
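
A minimal sketch of one possible record structure for such a list is given below; the field names and the example entry are illustrative assumptions rather than a prescribed format.

```python
# Minimal sketch of a digital asset record of the kind described above.
# Field names and the example entry are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DigitalAssetRecord:
    identifier: str
    description: str
    interfaces: List[str] = field(default_factory=list)    # e.g. network, serial, USB
    risk_assessment_refs: List[str] = field(default_factory=list)
    owner: str = ""
    is_sda: bool = False        # designated as a sensitive digital asset

inventory = [
    DigitalAssetRecord(
        identifier="PLC-COOL-01",
        description="Cooling loop programmable logic controller",
        interfaces=["ethernet to supervisory network", "engineering port"],
        risk_assessment_refs=["facility CSRM rev 1", "system CSRM cooling rev 2"],
        owner="I&C maintenance unit",
        is_sda=True,
    ),
]
print([a.identifier for a in inventory if a.is_sda])
```
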
Identification of digital assets 5.26. The list of digital assets should be maintained during the lifetime of the facility and periodically reviewed. The list should also be reviewed and updated if necessary whenever a system level risk assessment is performed.
Identification of digital assets 5.27. Digital assets that are also sensitive information assets should be designated as SDAs. Digital assets that might facilitate or contribute to an adverse effect on the function of SDAs should also be identified and considered in the digital asset analysis to determine, consistent with the CSP, whether they should be designated as SDAs.
Identification of digital assets 5.28. The list of SDAs should be classified and protected as sensitive information.
System computer security architecture, including digital asset analysis 5.29. The operator should identify key tasks and activities necessary to provide computer security for the facility. These tasks and activities should be associated with computer security levels and their corresponding computer security measures. The operator should ensure that the necessary resources and capabilities are available to perform those tasks and activities.
System computer security architecture, including digital asset analysis 5.30. The system CSRM process should identify all SDAs. Digital assets that are not SDAs may also need to be considered in the analysis of specific threats or types of attack if their compromise could adversely affect an SDA. The level of effort associated with the system level risk assessment should be graded to ensure that those systems assigned the highest computer security level are also subject to the most robust assessment.
System computer security architecture, including digital asset analysis 5.31. In general, systems that perform the same facility function should be assigned the same computer security level, including independent, diverse and redundant systems. The assignment of a less stringent computer security level to any such systems is strongly discouraged and may be considered only on a case by case basis if supported by a specific justification and security risk analysis.
System computer security architecture, including digital asset analysis 5.32. Asset analysis of SDAs should include consideration of information about the hardware, firmware and software of the SDA, which can be used as input to a vulnerability analysis. The vulnerability analysis may lead to a recommendation to perform procedures to identify, disable or remove unneeded services, ports or interfaces on the system (or network) of the SDA to reduce attack surface (i.e. system hardening; see para. A.64).
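
To illustrate one element of such hardening, the sketch below compares listening ports observed on an SDA (collected by whatever inspection method the facility has approved) against a documented allow-list; the port numbers are illustrative assumptions.

```python
# Minimal sketch of a hardening check: flag listening ports that are not on the
# approved baseline for an SDA. Port numbers are illustrative assumptions; how
# the observed ports are collected is left to site-approved tooling.

APPROVED_PORTS = {502}          # e.g. only the process protocol port is expected

def unexpected_ports(observed_ports):
    """Return ports that are open but not part of the approved baseline."""
    return sorted(set(observed_ports) - APPROVED_PORTS)

# Example: an inspection found an additional remote-administration port open.
print(unexpected_ports([502, 8080]))   # [8080] would be reviewed and disabled or justified
```
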
System computer security architecture, including digital asset analysis 5.33. The interfaces of each system (including its digital assets) should be analysed and categorized with respect to the zone boundary. The following categories may be used (a simple classification sketch follows the list):
  • Trusted internal communications: This category would include communications within and between systems or within a zone or between digital assets within a system, including internal pathways to devices at the zone boundary (e.g. firewalls, data diodes). There are no computer security measures that can effectively monitor or protect internal trusted communication pathways against cyber‑attack.

  • Authorized external communications: This category would include connections between zones via authorized permitted pathways and boundary devices. Such communications are normally between separate systems performing different facility functions. Computer security measures in the form of boundary devices ensure that all communication pathways, whether digital or analogue, are continually monitored and only those that are authorized can be used.

  • Potential unauthorized communications: This category would include the capability to make unauthorized connections between zones, for example using network cables, wireless connections or removable media. Such unauthorized communication pathways could be made between systems or digital assets that are in different zones but in physical or logical proximity, for example systems that are physically located in the same area with no physical barriers controlling access between them.
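
A simple classification sketch based on these categories is shown below; the two inputs (whether the interface stays within one zone and whether the pathway is authorized in the DCSA) are illustrative assumptions about how the interface inventory is recorded.

```python
# Minimal sketch: classify a documented interface into the three categories
# above, based on whether it crosses a zone boundary and whether the pathway
# is authorized in the DCSA. The inputs are illustrative assumptions.

def classify_interface(same_zone: bool, authorized_pathway: bool) -> str:
    if same_zone:
        return "trusted internal communication"
    if authorized_pathway:
        return "authorized external communication"
    return "potential unauthorized communication"

print(classify_interface(same_zone=False, authorized_pathway=False))
```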

System computer security architecture, including digital asset analysis 5.34. All digital assets with trusted internal communication pathways within a zone should be assigned to the same computer security level, namely that of the zone.
System computer security architecture, including digital asset analysis 5.35. Zone boundary devices should be assigned to a computer security level equivalent to the highest (most stringent) level applied to the equipment for which they are intended to provide protection. For example, a firewall between two zones of different computer security levels may have a trusted internal communication pathway with the zone assigned the higher computer security level but only an authorized external communication pathway with the other zone.
System computer security architecture, including digital asset analysis 5.36. Another example of a zone boundary device may be a malware detection kiosk, or antivirus scanner, which is used to scan removable media and mobile devices before entering and exiting a zone. This kiosk would be assigned the highest computer security level applied to anything in the zone for which it is intended to provide protection.38 In this case, the operator needs to ensure that the kiosk does not provide a common route for the compromise of different systems in different zones (e.g. by providing a common vulnerability that can be exploited to compromise different systems).
System computer security architecture, including digital asset analysis 5.37. All digital assets, including SDAs, that are connected via a trusted internal communication pathway should comply with the overall DCSA requirements. Permitted external communications need additional computer security measures (see para. 5.33(b)).
System computer security architecture, including digital asset analysis 5.38. SDAs may be allowed to be in proximity (logical or physical) to other SDAs provided that computer security measures are in place to ensure that these systems cannot interact though potential unauthorized communication pathways. These measures might be solely administrative control measures. Typically, SDAs are assigned to the higher computer security levels (e.g. levels 1–3).
System computer security architecture, including digital asset analysis 5.39. Digital assets that are not authorized to communicate with SDAs should not be allowed to be in logical or physical proximity to SDAs where there is the potential to have unauthorized communication pathways. The DCSA should provide for the design and maintenance of robust computer security measures to eliminate such pathways or create compensatory measures to reduce the potential for them to be used.
System computer security architecture, including digital asset analysis 5.40. Unassigned digital assets (i.e. those not assigned to a computer security level) should never be in proximity to SDAs. For example, a vendor’s equipment or personal mobile devices that have not been evaluated and assigned should be treated as potentially malicious devices to SDAs and should not be allowed in logical or physical proximity to facility SDAs.
System computer security architecture, including digital asset analysis 5.41. Asset analysis should include assessing the effects of credible scenarios of cyber‑attack on the system and the risk to the facility. The assessment should take account of the possibility that cyber‑attacks might occur during any stage of the lifetime of the facility or any phase of the system’s life cycle.
System computer security architecture, including digital asset analysis 5.42. Cyber‑attacks might affect an individual system or multiple systems and could be used in combination with other forms of malicious act causing physical damage. These potential specific component level interactions should be listed within the assessment report and assessed.
System computer security architecture, including digital asset analysis 5.43. The assessment should include consideration of malicious actions that could change process signals, equipment configuration data or software.
System computer security architecture, including digital asset analysis 5.44. The asset analysis should include identifying the locations at which information is stored and the pathways by which information flows within the system (including its digital assets). The analysis should also identify and justify the measures in place to protect the necessary data flows and communications and to identify any possible remaining vulnerabilities. The analysis could be supported by the following:
  • Analysing or testing the effectiveness of the security measures;

  • Documenting the status of the measures, including defining possible improvement points;

  • For identified systems, ensuring that the software has been subject to a vulnerability assessment.

System computer security architecture, including digital asset analysis 5.45. For example, consider the exchange of software (e.g. source code, object code) between a development environment and a security system. If no computer security measures are in place, then the compiler (hardware and software) will be assigned to the same zone (and computer security level) as the security system itself, since no boundary exists. However, if security measures are applied at the boundary between the compiler and the system — for example, testing the integrity of data and identifying any vulnerabilities in the code coming from the compiler — the compiler could be placed in a separate zone and assigned to a computer security level different from that of the system itself. The measures applied to the compiler’s output are credited with protecting the system and so would be assigned the same level as the system to which they are providing the protection.
System computer security architecture, including digital asset analysis 5.46. The analysis of digital assets should produce a list and description of the specific computer security measures that are implemented for each system. The measures should be a combination of technical, administrative and physical control measures.
System computer security architecture, including digital asset analysis 5.47. The analysis of digital assets should provide a qualitative or quantitative value of the acceptable risk threshold.
Verification of the system computer security risk assessment 5.48. The operator should verify and validate the system computer security risk assessment for each system as defined by the scope of the assessment. The verification of system CSRM outputs may use the evaluation methods outlined in para. 4.98 for facility CSRM.
System scenario identification and development 5.49. The national threat statement or DBT provides a basis for the generation of credible scenarios based on the motivation, capabilities, intentions and opportunity of potential adversaries (including adversaries using cyber techniques).
System scenario identification and development 5.50. The operator should develop credible scenarios for each system on the basis of the threat characterization as a basis for the validation of the computer security measures that provide protection to the system. The credible scenarios should include potential sequences of adversary actions that might result in compromise of SDAs.
System scenario identification and development 5.51. Scenarios should include common attack routes and techniques. These may include the following:
  • Social engineering, including phishing attacks;

  • Malicious emails;

  • Malicious web sites;

  • Infected mobile media devices;

  • Compromised maintenance and inspection equipment;

  • Remote access;

  • Insiders (witting and unwitting);

  • Compromise of the supply chain.

System scenario identification and development 5.52. Scenarios should be developed consistent with the national threat statement or DBT that applies to the facility to identify those SDAs that might be exposed to such attacks. It may be beneficial to start scenario development by considering the most likely or the highest consequence cases.
System scenario identification and development 5.53. The development of scenarios should have the following aims (in order of significance):
  • To determine the highest consequence scenarios involving SDAs;

  • To determine the most likely scenarios involving digital assets, including SDAs.

System scenario identification and development 5.54. Evaluation methods (see para. 4.98) should use credible scenarios (see paras 4.116–4.125) to verify the effectiveness of implemented computer security measures.
System scenario identification and development 5.55. The operator should verify that digital assets, including SDAs, are appropriately protected against the adversaries identified in the national threat statement or DBT that applies to the facility.
System computer security risk management report 5.56. The output of system CSRM should be documented in a report that includes the following:
  • Identification of all SDAs, including (as far as possible) all hardware and software components of each SDA.

  • Identification of digital assets that are components of, interface with, support or have the potential to access communication pathways connected to SDAs. These might include components of systems assigned a computer security level.

  • Identification of known vulnerabilities, deficiencies or weaknesses in the systems or components, for example potential procurement issues (e.g. supply of counterfeit or substandard parts), or human actions or omissions that might affect security.

  • Identification of technical, administrative and physical control measures.

  • Recommendations for implementation of countermeasures.

  • Recommendations for improvements to countermeasures (i.e. additional technical, administrative or physical control measures).

  • Identification of deficiencies in facility documentation or records.

  • Classification of sensitive information.

  • Access control lists for personnel and services.

  • Corrective actions, when adverse conditions appear.

  • Assessment of the residual system level risk.

  • Identification and description of other indicators that will assist in the evaluation of computer security (e.g. mean time between failures, mean time to repair, mean time to detect, mean time to recover, security quality metrics).

System computer security risk management report 5.57. The system CSRM report should be classified as sensitive information and protected accordingly.
System computer security risk management report 6.1. This section provides guidance specific to the different stages in the lifetime of a facility.
System computer security risk management report 6.2. The operator should review its plans for the facility against the regulations of the competent authority and identify issues that need to be addressed to meet regulatory requirements.
System computer security risk management report 6.3. The operator should ensure that it has a formalized methodology to perform a detailed facility CSRM process.
System computer security risk management report 6.4. The operator should develop the facility CSRM process as described in Section 4.
System computer security risk management report 6.5. The operator should verify that, provided that the DCSA specification can be met, the residual risk will not exceed the acceptable levels.
System computer security risk management report 6.6. The operator should plan the development of the competencies needed to support computer security during all stages in the lifetime of the facility.
System computer security risk management report 6.7. The planning stage may include activities in locations away from the intended facility site. The operator should apply computer security measures to the information used in these activities, and to other inputs to and outputs from the planning stage, that constitute sensitive information or make use of sensitive information assets.
System computer security risk management report 6.8. The operator should include computer security considerations in the siting stage of the facility, because some activities supporting computer security can only be performed in relation to the specific site, not remotely or generically (e.g. establishment of isolated networks, access for computer incident response teams, identification of the availability of expertise in computer security in the local workforce).
System computer security risk management report 6.9. In its siting plans for the location of major equipment, the operator should take into account the need to allow for operation of physical control measures that will be necessary to complement computer security measures.
System computer security risk management report 6.10. In siting, the operator should consider the availability of local infrastructure to support computer security measures (e.g. emergency communications networks).
System computer security risk management report 6.11. The operator should use the output of the facility CSRM work conducted during the planning stage to ensure that the facility design process provides for computer security requirements for facility functions (expressed in the DCSA and the CSP) to be met as an integral part of the system engineering activities for the facility. This applies to the design of a new facility or to the modification of the design for refurbishment or modification of the facility during the operation stage of the facility.
System computer security risk management report 6.12. The design process should take into account computer security requirements that arise owing to the dependencies between facility functions, as identified during the facility CSRM process.
System computer security risk management report 6.13. Computer security requirements should be provided in sufficient detail to allow design decisions to be made, the design to be verified and design changes to be evaluated.
System computer security risk management report 6.14. The operator should perform system CSRM for each system, including verification at each step of the design of the computer security measures.
System computer security risk management report 6.15. Physical and remote accessibility of the SDAs within vital areas by an insider should be considered at the design stage.
System computer security risk management report 6.16. The operator should develop computer security validation criteria for the commissioning stage. Systems performing facility functions assigned the highest computer security levels should be independently validated.
System computer security risk management report 6.17. Staff knowledgeable in computer security from different parts of the operating organization should be involved in the design process to ensure the following:
  • Appropriate computer security requirements are included.

  • Design changes improve and do not degrade computer security.

  • The changes, as implemented, meet the defined computer security requirements.

  • The effectiveness review includes computer security.

System computer security risk management report 6.18. The design should include the necessary directions for implementation of the computer security requirements. Design information, such as analysis reports, should be retained so that it is available in the future to authorized users of the design.
System computer security risk management report 6.19. Because design documents might contain sensitive information related to computer security, all design documents should be classified according to the information classification scheme and protected accordingly.
System computer security risk management report 6.20. The operator should ensure that any computer security requirements that need to be followed by vendors, contractors and suppliers are specified in their contracts39 [19]. Vendors, contractors and suppliers should be required to have computer security management systems and secure engineering environments in place and to apply security by design to the SDAs that they produce or supply.
System computer security risk management report 6.21. The operator should ensure that physical, administrative and technical control measures are established during the construction process to maintain the preventive and protective measures required by the CSP and the DCSA. For example, if lockable doors are to be installed on an enclosure, the locks should be installed and placed under control before installing SDAs within the enclosure, or appropriate compensatory measures should be put in place.
System computer security risk management report 6.22. The operator should ensure that the following computer security actions are performed as required by the CSP and the DCSA during the construction stage:
  • Assurance activities (i.e. testing, assessments, audits);

  • Use of staging areas, with process and security controls to verify that SDAs have not been tampered with;

  • Management of staff and verification of products of vendors, contractors and suppliers (both on the site and working remotely), from fabrication to installation;

  • Supply chain evaluation and management, ensuring that the verified procurement process is followed consistently and is not tampered with.

System computer security risk management report 6.23. The operator should include the testing of computer security measures in its acceptance testing for the delivery of systems to the facility from the system provider.
System computer security risk management report 6.24. The operator should perform configuration and testing activities during system and DCSA integration (see Fig. 7) to meet computer security requirements. For example, the following activities should be performed:
  • Passwords and secondary authentication methods for digital assets should be changed according to approved procedures.

  • Development and construction accounts for digital assets should be removed, and technical control measures should be enabled (see the sketch following this list).

  • System support tools (software and hardware) should be submitted for testing and evaluation using appropriate computer security measures.
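
The removal of development and construction accounts can be checked with a minimal sketch such as the following; the account names are illustrative assumptions, and the current account list would come from the asset's configuration records.

```python
# Minimal sketch: verify that development and construction accounts have been
# removed before operation. Account names are illustrative assumptions; the
# current account list would come from the asset's configuration records.

ACCOUNTS_TO_REMOVE = {"vendor_service", "commissioning_test"}

def remaining_development_accounts(current_accounts):
    """Return any accounts that should have been removed but are still present."""
    return sorted(ACCOUNTS_TO_REMOVE & set(current_accounts))

print(remaining_development_accounts(["operator", "vendor_service"]))  # ['vendor_service']
```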

System computer security risk management report 6.25. The operator should perform validation testing of the computer security measures. Validation of computer security measures and physical protection measures should be conducted jointly to ensure appropriate integration.
System computer security risk management report 6.26. If there is a conflict between safety measures and security measures, then the measures to ensure safety should be maintained and the operator should find a solution that also meets computer security requirements. Until such a solution is in place, compensatory computer security measures should be implemented to reduce the risk to an acceptable level and should be supported by a comprehensive justification and security risk analysis. The compensatory measures should not rely solely on administrative control measures for an extended period. The absence of a security solution should never be accepted.
System computer security risk management report 6.27. Review and approval of applicable CSP documents and supporting materials (required for system operation) should be completed prior to operation.
System computer security risk management report 6.28. The operator should assign continuing responsibility for design change, management, maintenance and operations of the entire CSP to an individual (supported as necessary by others with appropriate skills and knowledge).
System computer security risk management report 6.29. The operator should maintain documentation that describes how computer security measures are implemented, in compliance with the CSP, the DCSA and any externally imposed requirements.
System computer security risk management report 6.30. The operator should ensure that operational requirements are consistent with the computer security level of systems and digital assets. For example, the following might need to be considered:
  • Access restrictions, access control and monitoring may be different for equipment assigned to different computer security levels.

  • Different levels of trustworthiness check may be required for personnel working on different systems, depending on their assigned computer security level.

  • Duties may be segregated.

System computer security risk management report 6.31. Actions applied to systems as part of a vulnerability assessment might lead to plant or process instability and should therefore only be considered using test beds or spare systems, during factory acceptance tests or during long planned outages.
Maintenance 6.32. This section applies to short duration maintenance activities that are routinely performed during the operations stage. Extended maintenance (e.g. refurbishment, replacement of systems, repair) is addressed in the design, construction and cessation of operations stages.
Maintenance 6.33. The operator should ensure that maintenance activities are performed in a manner consistent with the computer security level of the system or digital asset being maintained. For example, in addition to the general considerations during operation listed in para. 6.30, the following steps should be taken:
  • The permitted maintenance activities should be specified.

  • The necessary access for maintenance should be identified and controlled.

  • Maintenance equipment may be restricted for use only within a specific computer security zone (or for a specific system or digital asset) or only for systems at a specific computer security level.

  • Secure maintenance environments may be required for some systems or digital assets.

Maintenance 6.34. Systems might be at greater risk during maintenance, when computer security measures might be removed or disabled. Furthermore, there may be additional access routes during maintenance, for example arising from the need to enable remote maintenance interfaces or the use of removable media to configure or upgrade software.
Maintenance 6.35. The operator should put adequate compensatory measures in place when the normal computer security measures are removed or disabled. Examples include the following:
  • Compensatory measures should provide physical protection when equipment is unlocked.

  • The need for remote interfaces for maintenance should be identified (and justified) in advance and appropriate computer security measures should be applied to such interfaces in accordance with the CSP.

  • The use of computer based tools (e.g. measurement, testing and calibration equipment) should be controlled and monitored to ensure that the tools are not compromised by cyber‑attack or provide a route to compromise the systems on which they are used. Computer equipment that may be temporarily connected to the system — such as testing or configuring equipment — should be protected against malicious software and unauthorized data transfer. The use of external equipment for such purposes should be minimized. Any such equipment should be inspected before it is brought into the facility.

  • Software should be checked to confirm that it is free from malicious software before it is loaded on to the system. This may include verifying that the software has not been tampered with and is authentic, for example through software signing with cryptographic hashes (a hash verification sketch follows this list).

  • Safety measures (e.g. concurrent verification by a second party) can also be used for security purposes.
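
As an illustration of verifying that software has not been tampered with, the sketch below compares a file's SHA-256 hash with an expected value received through a trusted channel; the file name and hash are placeholders, and a full software signing scheme would also verify the signer.

```python
# Minimal sketch: verify the integrity of a software package before loading it,
# by comparing its SHA-256 hash with a value received through a trusted channel.
# The file path and expected hash below are placeholders.
import hashlib

def file_sha256(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def integrity_verified(path: str, expected_hash: str) -> bool:
    return file_sha256(path) == expected_hash

# Example usage (placeholder values):
# print(integrity_verified("firmware_update.bin", "9f86d081884c7d65..."))
```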

Maintenance 6.36. During the cessation of operations stage, large scale modifications may be conducted in parallel, affecting multiple systems.
Maintenance 6.37. The operator should consider applying compensatory measures to address any risk arising from modifications to or degradation of security systems resulting from environmental or structural changes. This may include placing a greater reliance on administrative control measures and on vendors, contractors and suppliers to implement such measures.
Maintenance 6.38. Examples of changes for which compensatory measures may be applied include the following:
  • Computer security architectures and measures being modified or disabled to allow modification work to take place.

  • Fluctuations in staffing levels, possibly including new personnel being brought on‑site to perform activities involving digital assets, including SDAs. This may require that additional trustworthiness checks or other measures be put in place to address the insider threat.

  • Significant replacement of components, which may require the creation of a secure installation environment, secure storage, and additional measures for handling and secure sanitization of replaced SDAs.

Maintenance 6.39. When digital assets are decommissioned, the effect of this decommissioning (including any loss of integration with other digital assets outside the facility) on computer security should be evaluated and documented. If decommissioning of a system or digital asset reduces the effectiveness of computer security measures, the operator should put compensatory measures in place.
Maintenance 6.40. As the set of facility functions changes, the digital assets supporting these functions may be reassigned to a different computer security level or be unassigned. This might lead to a need to modify computer security measures for those digital assets.
Maintenance 6.41. The operator should ensure the secure destruction of any digital assets containing sensitive information that cannot be securely declassified when they are decommissioned.
Maintenance 7.1. The computer security policy and programme should provide the basis for computer security requirements defined by the results of facility and system CSRM (Sections 4 and 5, respectively) and in consideration of the specific stages in the lifetime of the facility (Section 6).
Maintenance 7.2. Computer security at nuclear facilities should be recognized by senior management and managers as a cross‑cutting discipline that needs specialized knowledge, expertise and skills.
Maintenance 7.3. Senior management has overall responsibility for computer security at a nuclear facility and needs awareness and understanding of the cyber threat and the potential adverse effect of a cyber‑attack on nuclear security.
Maintenance 7.4. Senior management should ensure that all the operator’s interactions with others and all internal processes are consistent with legal and regulatory requirements related to information and computer security.
Maintenance 7.5. Managers should promulgate the beliefs and values of nuclear security culture as they pertain to computer security. This includes promoting recognition that a credible threat exists from adversaries with cyber skills, and that these adversaries (including insider threats) might target nuclear facilities via a cyber‑attack or a blended attack.
Computer security policy 7.6. A computer security policy sets the high level computer security goals of an organization. The computer security policy should begin with a clear statement of why it is being established and should define the issue being addressed, as well as the goals and the consequences if the policy is not followed. The policy should be consistent with the State’s computer security policy and appropriate regulatory requirements. The policy should be enforceable and achievable and should include indicators that can be measured and audited.
Computer security policy 7.7. The operator’s computer security policy should take into account the results of facility CSRM (see Section 4). The computer security policy should require the protection of digital assets, including SDAs, against compromise from cyber‑attacks. Individual policy clauses should be clear and concise in identifying these requirements. Implementation of the requirements is addressed in detail in the CSP.
Computer security policy 7.8. The computer security policy should be endorsed and enforced by senior management. It should identify the organization or individual that owns the policy and the CSP.
Computer security policy 7.9. The computer security policy should be part of the overall facility security policy and should be coordinated with other relevant security responsibilities. When establishing a computer security policy, its effect on legal aspects and human resources also needs to be considered.
Computer security policy 7.10. The computer security policy may identify potential penalties and disciplinary actions against personnel not complying with the policy requirements.
Computer security policy 7.11. The computer security policy should be reflected in the CSP and through other lower level CSP elements that support implementation of computer security.
Computer security policy 7.12. The policy needs to set out clear indicators that will be used to demonstrate that policies are being met in all aspects and that each aspect is being performed satisfactorily.
Computer security programme 7.13. The CSP contains details of how the goals set out in the computer security policy are achieved. The CSP establishes the organizational roles, responsibilities, processes and procedures for implementing the computer security policy. A CSP may be specific to a facility (including its associated buildings and equipment) or an organization (including all its sites and organizational units).
Computer security programme 7.14. The CSP should be developed, exercised and maintained within the framework of the facility’s overall security plan.
Computer security programme 7.15. The CSP should take account of the results of facility CSRM (Section 4). Development of the CSP may include personnel involved in computer security, physical protection, safety, operations and information technology (IT). The CSP is illustrated schematically in Fig. 8.
Computer security programme 7.16. The CSP should be reviewed and updated (a) periodically to reflect developments in technology and threats and (b) in the event of computer security incidents or other nuclear security events.
Elements of the computer security programme 7.17. Reference [7] describes the elements of a CSP generally applicable for organizations within the nuclear security regime. Paragraphs 7.18–7.20 provide more specific details on elements of a CSP for nuclear facilities.
Elements of the computer security programme 7.18. The elements of the CSP should include addressing system vulnerabilities, applying computer security measures, performing risk analysis and conducting assurance activities to achieve an acceptable level of computer security risk.
Elements of the computer security programme 7.19. The elements of the CSP should be adapted and applied to the different stages in the lifetime of a facility and to different phases of the individual systems’ life cycles. Specific details of implementation in these different cases should be provided in the CSP.
Elements of the computer security programme 7.20. The operator should tailor the CSP to its facility, but it is suggested that as a minimum the following areas be included:
  • Organization and responsibilities:

    1. Organizational charts;

    2. Responsible persons and reporting responsibilities (see paras A.3–A.13 of the Appendix);

    3. Periodic review and approval process;

    4. Interfaces with other programmes, such as human resources, personnel related security, physical protection and training (see paras A.15–A.38 of the Appendix).

  • Risk, vulnerability and compliance management:

    1. Facility CSRM process and outputs (see Section 4);

    2. System CSRM process and outputs (see Section 5), including the process for the classification and identification of digital assets40, including SDAs;

    3. Frequency of security plan review and reassessment;

    4. Self‑assessment practices;

    5. Audit procedures and deficiency tracking and correction;

    6. Method for and occasions to start or repeat risk and vulnerability assessment;

    7. Regulatory and legislative compliance.

  • Security design and management:

    1. Fundamental security architecture (i.e. DCSA);

    2. Fundamental security design approaches (i.e. computer security levels and zones);

    3. Assignment of baseline computer security measures to each computer security level;

    4. Formalization of computer security requirements for contractors, vendors and suppliers, including maintenance contracts;

    5. Security considerations for the applicable stages in the facility lifetime (see Section 6).

  • Digital asset management:

    1. Attributes of digital assets (identification, computer security level, zone, location, associated consequences);

    2. Configuration management (hardware, operating systems, firmware, software applications, equipment status, associated configurations);

    3. Data flow and network diagrams identifying all external connections to other systems;

    4. Supplier information for assets.

  • Security procedures:

    1. Security incident handling;

    2. Business continuity;

    3. System backup, restoration and recovery;

    4. Supply chain;

    5. Access control;

    6. Information and communications management;

    7. Platform and application security (e.g. system hardening);

    8. System monitoring, including logging.

  • Personnel management:

    1. Trustworthiness checks;

    2. Awareness and training;

    3. Qualification of personnel;

    4. Reporting of security issues, including protection of staff reporting these issues;

    5. Termination or transfer.

Elements of the computer security programme 7.21. Further information on CSP elements can be found in international standards [19–21].
Elements of the computer security programme 7.22. The operator should define computer security related roles and responsibilities within the organization.
Elements of the computer security programme 7.23. Managers should ensure that all staff understand who within the organization is responsible for leading the CSP in the functional areas relevant to their work. Staff with computer security responsibilities need to be trained in the elements of and requirements specified in the CSP.
Elements of the computer security programme 7.24. The management of computer security should be integrated into the existing management system for the facility (see paras 7.30–7.34) to the extent possible and practicable. For existing facilities, the management system will already include well defined roles and responsibilities, and these should be adjusted to incorporate computer security.
Elements of the computer security programme 7.25. Personnel with significant computer security responsibilities should not have conflicts of interest with other functions of the organization or with other duties. Managers should put in place policies and processes to avoid or mitigate any potential conflicts.
Elements of the computer security programme 7.26. The operator should ensure that individuals or organizations performing key assessment and verification activities are appropriately qualified and independent.
Elements of the computer security programme 7.27. Computer security needs cooperation between staff in different roles and organizational units. The operator should put in place a formalized framework with the aim of ensuring interdisciplinary cooperation.
Elements of the computer security programme 7.28. The operator needs to identify the external and internal interfaces involved in the CSP. This includes the following:
  • Routine interfaces between the facility operator and relevant competent authorities (e.g. regulatory bodies, law enforcement, intelligence agencies, security services);

  • Reporting to competent authorities and interface with external response forces in the event of a security incident;

  • Internal interface with on‑site response team;

  • Public relations;

  • Relations with vendors, contractors and suppliers, including the supply chain.

Elements of the computer security programme 7.29. The operator should manage risk through a formalized process (i.e. facility and system CSRM) that assesses and manages risk and vulnerabilities at the facility. The operator should use the results of these processes within its management system.
Management system 7.30. The management system should be integrated to include computer security, physical protection, safety, health, environmental, quality and financial elements.
Management system 7.31. The management system should have formal and established interfaces with the facility and system CSRM.
Management system 7.32. The computer and information security goals should be defined and managed within the management system in a manner similar to other business objectives.
Management system 7.33. The management system should be reviewed to ensure its completeness and compliance with facility security policies. It should be periodically reviewed and adapted to changing conditions in the facility and in the environment. Figure 3 of Ref. [22] illustrates the continual improvement process for management systems.
Management system 7.34. The elements of the CSP (including facility and system CSRM) should be reviewed and the necessary provisions for computer security should be integrated into the management system.
Computer security indicators 7.35. Computer security indicators can be an effective tool for security managers to measure the maturity of the management system; the risk associated with potential cyber‑attacks affecting SDAs; the effectiveness of different components of their security programmes; the security of a specific system, product or process; and the ability of staff within the organization to address security issues for which they are responsible.
Computer security indicators 7.36. Indicators should support decisions concerning acceptable risk and provide an input to a risk registry.
Computer security indicators 7.37. An analysis should be performed to identify parameters and establish indicators that support effective management of the CSP. Indicators that may be useful include mean time to recover (from cyber‑attack), number of computer security incidents, number of restorations of SDAs (potential recurrences), security backlogs and vulnerability tracking information (e.g. common vulnerability scoring system scores, mitigation effectiveness, control deployment time, patch deployment).
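
A minimal sketch of how two of these indicators might be derived from incident records is given below; the record format and the example data are illustrative assumptions.

```python
# Minimal sketch: derive two computer security indicators (number of incidents
# and mean time to recover) from incident records. The record format and the
# example data are illustrative assumptions only.
from datetime import datetime

incidents = [
    {"detected": datetime(2024, 3, 1, 8, 0), "recovered": datetime(2024, 3, 1, 14, 0)},
    {"detected": datetime(2024, 6, 12, 22, 0), "recovered": datetime(2024, 6, 13, 2, 0)},
]

def mean_time_to_recover_hours(records):
    durations = [(r["recovered"] - r["detected"]).total_seconds() / 3600 for r in records]
    return sum(durations) / len(durations) if durations else 0.0

print("incidents:", len(incidents))
print("mean time to recover (h):", mean_time_to_recover_hours(incidents))
```
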
Computer security indicators 7.38. The use of the indicators should be integrated into the organization’s management system.
Computer security indicators 7.39. Facility and system security design is specified in facility and system CSRM (see Sections 4 and 5, respectively). One practical implementation of these outputs, namely the DCSA and measures assigned to computer security levels, is described in Section 8.
Computer security requirements 7.40. Modifications to the facility or system should be analysed to determine potential effects on security before changes are made to allow for risks to be managed.
Computer security requirements 7.41. Computer security should be considered as a factor when determining the design inputs, which include the following:
  • Functional requirements;

  • Interface requirements;

  • Operational requirements;

  • Location of equipment;

  • Environmental considerations;

  • Codes and standards to be used;

  • Contractual considerations;

  • Supply chain considerations;

  • Logistics (e.g. coordination of complex operations involving many people, facilities or supplies);

  • Past operating experience;

  • Introduction of new technologies;

  • Human factor considerations;

  • Design requirements for each engineering discipline (including computer security);

  • Fabrication considerations;

  • Installation;

  • Commissioning;

  • Decommissioning;

  • Financial considerations.

Computer security requirements 7.42. The operator should, for each digital asset, document attributes that have significance for computer security (an illustrative inventory sketch is given after this list). These attributes may include the following:
  • Asset identifier and location;

  • Asset configuration;

  • Functions and operational modes;

  • Interconnections, including power supplies;

  • Data flow, including internal and external connections;

  • Procedures that initiate communication, frequency of communication and protocols for such communication;

  • Analysis of user groups;

  • Ownership (for data and computerized systems);

  • Computer security level and zone, and assessed consequences of failure.
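A minimal sketch of how the attributes listed in para. 7.42 might be captured in a structured asset inventory is given below (Python). The field names and example values are hypothetical; the operator's actual record structure would follow its own configuration management and documentation conventions.

```python
from dataclasses import dataclass

@dataclass
class DigitalAssetRecord:
    """Hypothetical inventory entry capturing the attributes listed in para. 7.42."""
    asset_id: str
    location: str
    configuration: str              # e.g. reference to the approved baseline
    functions: list[str]            # functions and operational modes
    interconnections: list[str]     # including power supplies
    data_flows: list[str]           # internal and external connections
    protocols: list[str]            # communication procedures, frequency, protocols
    user_groups: list[str]
    owner: str                      # ownership of data and computerized systems
    security_level: int             # assigned computer security level
    zone: str
    failure_consequence: str        # assessed consequence of compromise or failure

# Example entry (illustrative values only)
asset = DigitalAssetRecord(
    asset_id="PLC-041",
    location="Turbine hall, cabinet C3",
    configuration="baseline-2024-02",
    functions=["condensate pump control"],
    interconnections=["UPS-2", "plant network switch SW-7"],
    data_flows=["historian (outbound only)"],
    protocols=["Modbus/TCP"],
    user_groups=["I&C maintenance"],
    owner="I&C engineering",
    security_level=2,
    zone="Zone B",
    failure_consequence="loss of condensate control (medium)",
)
```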

Computer security requirements 7.43. Digital asset management should take into account the equipment status of technical control measures that use digital technology. Computer security operations and physical protection operations may have joint responsibility for integrated security measures, systems and procedures. Joint operational control may include control over physical devices used to protect computer equipment (e.g. rooms, doors, keys, locks, cameras, motion sensors, tamper indicators).
Configuration management 7.44. The goal of configuration management is to have detailed, up to date records of the installed software and hardware components and how they are configured. Configuration management should include information needed for the following:
  • Identifying the need for computer security measures;

  • Verifying that the computer security measures are implemented and configured correctly;

  • Managing changes throughout the life cycle of the systems;

  • Supporting computer security assessments;

  • Understanding the reasons for changes to the computer security measures.

Configuration management 7.45. Configuration management includes the change management process. Computer security should be included in this process such that all changes are evaluated from a computer security perspective before implementation. For example, appropriate reviews are performed and documented before carrying out procedures that could bypass, change or reduce the effectiveness of the computer security measures in place. Personnel changes may also necessitate changes relating to computer security (e.g. credential cancellation and management).
Configuration management 7.46. The operator should develop security procedures to support facility and system computer security design and management. During development of these procedures, the operator should consider the two person rule or segregation of work duties, taking into account the appropriate trust model and the security level assigned to the zone(s) applicable to the procedure.
Configuration management 7.47. Procedures that provide detailed instructions on how to disable or bypass computer security measures should ensure that such activities are recorded and logged. The procedure may also provide instructions for the application of alternate or compensatory computer security measures when the baseline computer security measure is disabled.
Configuration management 7.48. These procedures may be new stand‑alone procedures or may be integrated within existing procedures that meet one or more safety, security or organizational objectives.
Configuration management 7.49. Personnel management includes the necessary provisions for establishing an appropriate level of trustworthiness, enforcing confidentiality undertakings, defining required competencies and, where necessary, applying penalties or terminating employment.
Configuration management 7.50. Computer security activities and personnel related security activities should be coordinated to provide protection against insider threats. In particular, personnel with key security responsibilities (e.g. system administrators, security team) may require a higher level of trustworthiness. Further guidance on the protection from insider threats is given in Ref. [6].
Configuration management 7.51. The CSP should include provision of training and awareness raising to develop and maintain personnel and organizational competencies and qualifications that are necessary for computer security.
Configuration management 8.1. An example of the implementation of DCSA with five different computer security levels in a nuclear power plant is presented below. This is one possible implementation of the graded approach; the exact choice of levels, DCSA and computer security measures should be tailored according to the facility and its environment through specific analysis.
Configuration management 8.2. When implementing the DCSA, the operator should consider limiting the dynamic elements of networks and individual systems to make their behaviour more predictable. This increased predictability might help in the implementation of effective computer security measures.
Configuration management 8.3. Zones assigned the most stringent computer security level should only be connected to zones assigned lower levels of security by fail‑secure, deterministic, unidirectional data communication pathways. The direction of these data pathways should be from the zone with the more stringent computer security level to the zone with the less stringent computer security level.41 Exceptions are strongly discouraged and may only be considered on a strict case by case basis and if supported by a complete justification and security risk analysis.42
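The directionality rule in para. 8.3 can be expressed as a simple check, sketched below in Python under the assumption that levels are numbered so that level 1 is the most stringent. Any flow that fails the check would require the documented, case by case justification and security risk analysis described above.

```python
def flow_is_permitted(source_level: int, destination_level: int) -> bool:
    """Return True if a networked data flow complies with the rule in para. 8.3.

    Levels are assumed to be numbered so that level 1 is the most stringent.
    A flow is permitted only when it goes from a more stringent (lower-numbered)
    zone to a less stringent (higher-numbered) zone; any other direction requires
    an explicit, documented exception supported by a security risk analysis.
    """
    return source_level < destination_level

# Illustrative checks: level 1 -> level 3 is permitted, level 4 -> level 2 is not.
assert flow_is_permitted(1, 3)
assert not flow_is_permitted(4, 2)
```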
Configuration management 8.4. Digital devices or communications used for monitoring, maintenance and recovery should not bypass computer security measures used to protect communication pathways between devices that have different computer security levels.
Configuration management 8.5. Systems assigned to the most stringent computer security level should be placed within the most secure zone’s boundaries.43
Configuration management 8.6. Data communications between systems within the facility and the emergency centre (either on the site or off the site) should be protected by computer security measures.
Configuration management 8.7. Computer security measures that ensure the logical and physical decoupling of zones are based on the requirements of the zones’ computer security levels. To maintain defence in depth, a direct path connecting through several zones should not be allowed.
Configuration management 8.8. Technical control measures that provide security at the boundaries of zones should be designed to be resilient to cyber‑attack and to provide alerts in the event of potential compromise or malicious activity.
Configuration management 8.9. Where external connectivity is provided, security should be applied using the graded approach. The provision of external connectivity should meet the requirements for protecting the confidentiality, integrity and availability of sensitive information consistent with the computer security level assigned to the zone.
Configuration management 8.10. Appropriate access restrictions (including monitoring of access) should be applied to provide protection based on the graded approach because these external connections can serve as a route for compromise of systems at the facility.
Configuration management 8.11. Examples of externally accessible systems include the following:
  • Environmental monitoring systems;

  • Building automation systems;

  • Fire protection systems;

  • Communications with emergency centres;

  • Remote vendor access (where permitted);

  • Field devices located outside the physical security perimeter;

  • Visitor control.

Configuration management 8.12. Figure 9 gives an example of one implementation of a DCSA, showing levels, zones, systems and digital assets. This is based on the guidance provided in Section 3.
Configuration management 8.13. Example security requirements applied within each computer security level are presented in paras 8.16–8.21. The exact choice of levels and their security requirements should be tailored according to the facility and its environment through specific analysis.
Configuration management 8.14. Two types of unassigned digital asset may be encountered:
  • Restricted or proscribed equipment, where restrictions placed on the operator mean that the security of the digital assets cannot be assessed. This could be due to licence conditions, or to contractual, regulatory or legal requirements that prohibit the operator from inspecting and modifying the equipment (e.g. safeguards related equipment).

  • Unannounced equipment, which might be brought to the facility without being requested by or without the prior agreement of the operator. Such equipment is considered to be ‘contraband’ until a computer security risk assessment can be completed.

Configuration management 8.15. The operator may place restrictions on unassigned assets until they can be assessed and assigned to the appropriate computer security level and the required computer security measures can be put in place. For example, unassigned devices should not be brought into the proximity of systems that have medium to very high computer security levels.
Configuration management 8.16. For applicable systems and levels, the following generic requirements are applied:
  • All technical, physical, personnel and organizational security measures for systems and networks are designed and implemented in a systematic manner and according to approved processes and procedures.

  • Policies and practices are defined for each computer security level.

  • Users are obliged to comply with security policies and security operating procedures.

  • Staff permitted access to the system are suitably qualified and experienced and determined to be trustworthy where necessary.

  • Users and administrators have access only to those functions on those systems that they need for performing their jobs. Accumulation of access rights by an individual person is avoided.

  • The system’s functionality and interfaces are limited to the extent possible, with the objective of reducing overall system vulnerability.

  • Appropriate access control and user authentication are in place.

  • Protection against infection and spreading of malware is in place.

  • Security logging and monitoring, including procedures for adequate response, are in place.

  • Application and system vulnerabilities are monitored, and appropriate measures are taken.

  • The adequacy and effectiveness of measures is reviewed periodically.

  • System vulnerability assessments are undertaken periodically.

  • Removable media are controlled in accordance with security operating procedures. Privately owned devices are not allowed to be connected to systems and networks.

  • Digital assets and associated computer security measures are strictly maintained using the applicable change management procedures.

  • Appropriate backup and recovery procedures are in place.

  • A service device is assigned to exactly one computer security level.

  • Physical access to components and systems, including service devices, is restricted according to their functions.

  • Measures to prevent the unauthorized introduction of systems into computer security zones are in place.

  • Only approved and qualified users are allowed to make modifications to the systems.

Configuration management 8.17. In addition to the generic requirements, requirements for preventive and protective measures are used for systems that are vital to the facility and require the highest level of security (e.g. reactor protection systems). These requirements can include the following:
  • Systems are designed and implemented to be verifiable and testable against a potential attack by an adversary.

  • No networked data flow of any kind from systems assigned less stringent computer security levels can enter level 1 systems when integrity and availability are priorities. Only outward communication is possible. Exceptions are strongly discouraged and may only be considered on a strict case by case basis and if supported by a complete justification and security risk analysis.44

  • No remote maintenance access is allowed.

  • Physical and logical access to systems is strictly controlled, monitored and recorded.

  • The number of staff given access to the systems is limited to an absolute minimum.

  • The two person rule is applied to prevent unauthorized actions by an insider threat.

  • All activities and potential security events are logged and monitored.

  • Connection of external storage devices is approved and verified on a case by case basis.

  • Strict organizational and administrative procedures apply to any modifications, including hardware maintenance, software updates and software modifications.

Configuration management 8.18. In addition to the generic requirements, requirements for preventive and protective measures should be used for systems, such as operational control systems, that require a high level of security. These requirements can include the following:
  • Only an outward, unidirectional networked flow of data is allowed from level 2 to level 3 systems. Only necessary acknowledgement messages or controlled signal messages can be accepted in the opposite (inward) direction (e.g. for TCP/IP (Transmission Control Protocol/Internet Protocol)).

  • Remote maintenance is not allowed.

  • The number of staff given access to the systems is kept to a minimum, with a clear distinction between users and administrative staff.

  • Physical and logical access to systems is strictly controlled and documented.

  • Administrative access from other computer security levels is avoided. If this is not possible, such access is strictly controlled (e.g. by adopting the two person rule and two factor authentication).

  • All reasonable measures are taken to ensure the integrity and availability of the systems.

Configuration management 8.19. In addition to the generic requirements, requirements for preventive and protective measures should be used for real time systems that are not required for operations (e.g. process supervision systems in a control room), provided that all such systems have been assessed as having a medium severity level for the relevant cyber threats. These requirements can include the following:
  • Access to the Internet from level 3 systems is not allowed.

  • Logging and audit trails for key resources are monitored.

  • Security gateways are applied to protect this level from uncontrolled data connections from level 4 systems and to allow only specific and limited activity.

  • Physical connections to systems are controlled.

  • Physical and logical access to systems is controlled and documented.

  • Remote maintenance access is allowed on a case by case basis provided that it is robustly controlled; the remote computer and user follow a defined security policy, specified in the contract.

  • System functions available to users are controlled by access control mechanisms and based on the ‘need to know’ rule. Any exception to this rule is carefully considered and protection is ensured by other means (e.g. physical access).

  • Administrative access from other computer security levels is avoided wherever possible. If this is not possible, such access is strictly controlled (e.g. through two factor authentication).

Configuration management 8.20. In addition to the generic requirements, requirements for computer security measures should be applied to technical data management systems that are used for maintenance or operation activity management related to components or systems required by the technical specification for operation (e.g. work permit, work order, tag out, documentation management), if such systems need medium levels of computer security. These requirements can include the following:
  • Access to the Internet from level 4 systems is not allowed.

  • Security gateways are implemented to protect this level from unauthorized data communications through trusted and approved external company or facility networks and to allow specific activities that are authorized.

  • Physical connections to systems are controlled.

  • Remote maintenance access is allowed but controlled; the remote computer and user follow a defined security policy, specified in the contract.

  • System functions available to users are controlled by access control mechanisms. Any exception to this rule is carefully considered and protection is ensured by other means.

  • Remote external access is allowed to selected services and for approved users, provided that appropriate access control mechanisms are in place.

Configuration management 8.21. Requirements specifying computer security measures should be used for systems not directly important to technical control or operational purposes (e.g. office automation systems), if such systems need low levels of computer security. These requirements can include the following:
  • The computer security level does not fall below a baseline protection level, defined according to the latest state of the art.

  • Only approved and qualified users are allowed to make modifications to the systems.

  • Access to the Internet from level 5 systems is allowed, provided that adequate preventive and protective measures are applied.

  • Remote external access is allowed for authorized users, provided that appropriate measures are in place.

  • Physical connection of third party devices to systems and networks is technically controlled. Any such interfaces to higher level systems are characterized and evaluated independently to ensure compliance with the computer security architecture.

Configuration management A.1. This appendix provides examples of selected CSP elements for use with the performance based approach to computer security. An operator may need to modify these elements to reflect particular organizational or facility specific circumstances, but the examples cover all the types of information that the operator needs to develop and implement an effective CSP.
Configuration management A.2. The operator should require these or similar elements to facilitate understanding between organizational units, vendors, contractors and suppliers, and competent authorities. The elements may need to be tailored to the specific characteristics of the operating organization and facility to improve understanding.
Management A.3. Senior management at a facility establishes a computer security policy as well as processes and support mechanisms to ensure that the policy is implemented. To achieve this, senior management should take the following steps:
  • Assume overall responsibility for all aspects of computer security.

  • Define security objectives for the facility.

  • Ensure compliance with relevant laws and regulations.

  • Maintain awareness of the current nuclear security threat and associated trends.

  • Set the risk acceptance level for the facility.

  • Assign organizational responsibilities for computer security.

  • Ensure adequate communication between personnel responsible for different aspects of nuclear security.

  • Ensure compliance with the computer security policy.

  • Provide adequate resources to implement a sustainable CSP.

  • Ensure periodic reviews and updates of the computer security policy and procedures.

  • Ensure support for training and awareness programmes.

Computer security specialist A.4. The operator should assign overall responsibility for computer security at the facility to one individual or group. In this publication, the title ‘computer security specialist’ is used to define that role.46
Computer security specialist A.5. The computer security specialist should coordinate closely with activities throughout the facility, but in an independent manner. The computer security specialist should have clear and accessible reporting lines directly to senior management, as computer security can affect almost all facility activities.
Computer security specialist A.6. Computer security responsibilities within different organizational departments should be clearly defined and coordinated to avoid gaps or conflicts and to ensure that computer security is implemented in a coherent manner. This is especially necessary if the computer security specialist role is assigned to a group rather than to one individual: the computer security specialist should constitute one single authority within the operating organization, responsible for addressing organization‑wide issues and resolving any conflicts that might arise.
Computer security specialist A.7. The computer security specialist should have in‑depth knowledge of computer security and good knowledge of other aspects of security in nuclear facilities, as well as knowledge of nuclear safety and project management and the ability to integrate people from different disciplines into an effective team.
Computer security specialist A.8. The computer security specialist should have the authority and responsibility for administering the CSP.
Computer security specialist A.9. The typical specific responsibilities of the computer security specialist include the following:
  • Advising senior management on computer security.

  • Leading the computer security team.

  • Advocating computer security within the organization, including improvements when necessary.

  • Coordinating and controlling the development of computer security activities (e.g. implementing computer security policy, specific directives and guidelines, procedures and, ultimately, computer security measures).

  • Coordinating with physical protection personnel and other security and safety personnel to plan and specify computer security measures, including those to respond to computer security incidents.

  • Identifying systems critical to computer security within the facility (i.e. those providing baseline computer security measures). Asset owners should be informed of their equipment’s role in computer security.

  • Conducting periodic computer security risk assessments, independently from operational staff.

  • Conducting periodic inspections, audits and reviews of the baseline computer security measures and providing status reports to senior management.

  • Developing and arranging computer security training and qualification of relevant personnel.

  • Preparing for and leading the response to computer security incidents, including coordination with relevant internal and external personnel involved in the response.

  • Investigating computer security incidents and developing post‑incident corrective actions.

  • Participating in assessment of overall facility security.

  • Participating in analysis of requirements for new computer based systems.

Computer security team A.10. The operator should identify and assign personnel to a computer security team. This team can be a fixed group of individuals or can include individuals with specific expertise as needed. The team supports the computer security specialist in fulfilling their responsibilities: the computer security specialist needs to have access to expertise in all disciplines associated with computer security, including facility safety and plant operations as well as physical protection and personnel related security.
Computer security team A.11. Members of the computer security team should be responsible for advocating computer security in their respective organizational units.
Computer security team A.12. The computer security team’s activities include actively monitoring digital assets, including SDAs, for any indications of a possible cyber‑attack, and coordinating response to computer security incidents. This might include staffing a security operations centre for the monitoring and assessment of potential computer security incidents and for the initiation and support of response activities, which might also need support from other organizations.
Other management responsibilities A.13. Managers at different levels within the organization should ensure that appropriate attention is paid to computer security within their areas of responsibility. Typical responsibilities of managers in their respective areas include the following:
  • Understanding the significance and the role of computer security in nuclear security;

  • Operating within the requirements and processes defined by the CSP;

  • Providing operational requirements and feedback to senior management relevant to computer security, and resolving any conflicts between operational, security and safety requirements;

  • Alerting senior management to any conditions that might lead to changes in the level of computer security, such as changes in personnel, equipment or processes;

  • Ensuring that staff are sufficiently trained and briefed on computer security issues relevant to their roles;

  • Ensuring that vendors, contractors and suppliers working for them operate within the requirements and processes defined by the CSP;

  • Tracking, monitoring, responding to and reporting on computer security incidents;

  • Enforcing computer security measures.

Individual responsibilities A.14. Each individual within an organization should be responsible for performing their own tasks consistently with the CSP. Specific responsibilities include the following:
  • Understanding the significance and the role of computer security in nuclear security;

  • Understanding the organization’s policy for computer security;

  • Having knowledge of computer security procedures relevant to their job;

  • Operating within the constraints implied by the computer security policy;

  • Notifying managers of any changes that might adversely affect computer security;

  • Notifying relevant points of contact and managers of any incidents or possible incidents involving a compromise of computer security;

  • Attending initial training on computer security and refresher training on a regular basis.

Cross‑department responsibilities A.15. Computer security is a cross‑cutting discipline that affects and is affected by many different organizational units and activities. Computer security needs close coordination and cooperation between different organizational units to be effective. Paragraphs A.16–A.38 describe some of the departmental responsibilities and cross‑cutting issues.
Physical protection A.16. The site security plan and the CSP are both essential in developing a comprehensive security plan for the facility, and they therefore need to complement each other. SDAs are protected by physical access control requirements, and compromise of computer based systems can lead to degradation or loss of physical protection functions. Furthermore, adversaries might seek to attack a facility through coordinated cyber‑attack and physical attack (i.e. blended attack).
Physical protection A.17. If the organizational units responsible for the site security plan and the CSP are different, they should communicate and coordinate their efforts to ensure consistency between the plans during the development and review process.
Physical protection A.18. The operator should assign relevant roles and responsibilities in the development, implementation and maintenance of the CSP to physical protection personnel. These may include the following:
  • Ensuring that only authorized access is permitted to SDAs;

  • Identifying unauthorized removable media and mobile devices entering the facility;

  • Identifying unauthorized removal of information or information assets from the facility;

  • Ensuring that policies applicable to any removable media and mobile devices permitted in the facility are applied (e.g. scanning for malicious software before entry into the facility);

  • Reporting computer security incidents (e.g. detection of malicious software, unauthorized removal of information assets) according to the incident response procedure;

  • Assessing information security practices (e.g. desk checks, checking of locked rooms and cabinets, provision of standards for devices providing physical protection of information assets, access control and monitoring);

  • Supporting incident response for computer security incidents related to the physical protection system.

Information technology A.19. IT personnel perform support, management and administrative tasks within a nuclear facility. These tasks may include activities involving digital assets used to prepare and store operational and maintenance procedures, work instructions, configuration management systems, design documents and operating manuals.
Information technology A.20. The CSP should clearly identify the digital assets and associated networks that are the responsibility of the IT personnel. IT personnel should monitor the identified digital assets and associated networks and report any computer security incidents to senior management and the computer security specialist according to the incident response plan.
Information technology A.21. IT personnel should take actions to ensure that computer security incidents involving digital assets (but not SDAs) and networks do not propagate to affect SDAs.
Engineering A.22. Engineering personnel should have formal processes to ensure coordination with other relevant organizational units to ensure that measures for nuclear security and nuclear safety are designed and implemented in an integrated manner consistent with the requirements set out in the CSP. Engineering personnel should recognize that safety, physical protection and computer security are distinct disciplines that need support from appropriately qualified experts in those different disciplines.
Engineering A.23. Engineering personnel should provide evidence of the effectiveness of the computer security architecture (i.e. the DCSA) that can be compared with the results expected on the basis of facility and system CSRM.
Engineering A.24. Engineering personnel should lead or support the system CSRM process for those facility systems of which they are the owner.
Engineering A.25. Engineering personnel should provide direction to vendors, contractors and suppliers regarding requirements for computer security within facility systems. Engineering personnel are responsible for reviewing vendors’ designs to ensure that they meet the computer security requirements. Engineering personnel should seek confirmation from the vendor that products supplied to the facility have been developed in a secure environment. Engineering personnel should establish and follow a procedure for reviewing technical product documentation, accepting on‑site product consignments and testing products to ensure that computer security requirements are met.
Engineering A.26. Engineering personnel should ensure that performance monitoring activities are in place to confirm that computer security measures continue to be effective.
Operations A.27. The CSP should identify those facility systems and networks that are the responsibility of operations personnel. Operations personnel are responsible for complying with the requirements for these systems set out in the CSP.
Operations A.28. Operations personnel should ensure that the DCSA and computer security measures under their responsibility are maintained and remain effective.
Operations A.29. Operations personnel should ensure that procedures are in place for identifying computer security incidents and initiating response for systems and networks under their responsibility.
Operations A.30. Operations personnel should promote situational awareness to ensure that only authorized removable media and mobile devices are used within the facility.
Procurement and supply chain organization A.31. Products should be procured to meet the specifications for the equipment, device or component. The specifications should include appropriate computer security requirements.
Procurement and supply chain organization A.32. Procurement processes should include checks to ensure that SDAs developed or supplied by vendors and suppliers include computer security measures consistent with each SDA's assigned computer security level.
Procurement and supply chain organization A.33. Procurement personnel should understand the importance of specific computer security requirements in procurement. These requirements should be enforced through legal agreements with vendors, contractors and suppliers, such as licences or contracts.
Procurement and supply chain organization A.34. Procurement and engineering personnel might not know that a general purpose device will be classified as an SDA if the operator uses it in a particular application. In such cases, the devices should be procured taking into account the possibility that they might be deployed as SDAs, and appropriate computer security requirements should be applied.
Procurement and supply chain organization A.35. Procurement personnel should work with engineering personnel to ensure that computer security requirements are specified as contractual requirements for vendors, contractors or suppliers and that designs submitted by vendors, contractors or suppliers meet computer security requirements. Procurement personnel should also inform engineering personnel if support from a vendor, contractor or supplier for an SDA is, or appears likely to be, no longer available.
Procurement and supply chain organization A.36. Procurement personnel should consider conducting reviews of vendors, contractors and suppliers before entering into contractual agreements. Such reviews may include analysis of the processes used by the vendor, contractor or supplier to design, develop, test, implement or support SDAs or assessment of the vendor, contractor or supplier’s training and experience in developing SDAs with the required levels of computer security. The reviews may also help (a) determine whether primary vendors, contractors or suppliers have in place security measures to properly evaluate the trustworthiness of subordinate vendors, contractors and suppliers and (b) ensure the provenance of SDAs, SDA components, and software and updates provided to the operator.
Procurement and supply chain organization A.37. Procurement personnel should ensure that all vendors, contractors and suppliers of SDAs have procedures in place to notify the operator in case of any supply chain incidents with the potential to affect SDAs (e.g. compromise of SDA components, SDA technology, development processes or sensitive information).
Procurement and supply chain organization A.38. Procurement personnel should consider ensuring that vendors, contractors and suppliers of SDAs have a trusted distribution route for delivering SDAs, SDA components, and software and updates to the operator.
External relationships and interfaces for risk management A.39. Risk management processes should include analysis of external relationships (i.e. vendors, contractors and suppliers). Responsibility and accountability for meeting requirements derived from system CSRM should be specified in contractual arrangements.
External relationships and interfaces for risk management A.40. The operator should audit and inspect relevant activities of vendors, contractors and suppliers to ensure that computer security requirements set out in the CSP are being met. Contracts with vendors, contractors and suppliers should require them to allow the operator to perform these activities.
External relationships and interfaces for risk management A.41. The operator’s risk management processes should take account of regulatory requirements and other external requirements affecting computer security. The operator should provide for relevant competent authorities to maintain oversight and perform inspections in respect of measures to meet these requirements.
Computer security assurance A.42. Computer security assurance activities should be conducted throughout the lifetime of the facility, as described in Sections 4 and 5. The specific assurance activities will vary according to the stage in the lifetime. Reference [8] provides details of assurance activities applicable to I&C systems.
Computer security assurance A.43. Such activities by an operator might include assessments (including audits), reviews, exercises and testing47.
Computer security assurance A.44. The operator should verify that the CSP is consistent with the operator’s computer security policy (e.g. computer security assessment may be used to verify that computer security requirements reflecting the operator’s policy are met). This may involve a number of complementary assessments to evaluate different elements of the CSP and their implementation. The outputs of the assessments will include identification of deficiencies and good practices, and suggestions for improvement.
Computer security assurance A.45. These activities should form the basis for continual improvement of the CSP. To support this, assurance activities should be repeatable and reliable, and should be conducted on a periodic basis, as well as whenever a computer security incident occurs or the threat changes.
Computer security assurance A.46. Assurance activities should include the evaluation of organizational effectiveness and the measures in place to ensure correct implementation and effectiveness of computer security.
Computer security assurance A.47. Assurance activities may be performed by internal or external groups: for example, computer security assessment can be performed by an internal team as a self‑assessment activity. If the assessment is performed by external groups, the results need to be verified internally.
Computer security assurance A.48. Internal and external assurance activities should be complemented by independent evaluations performed by external parties. Independent assessors will need access to relevant staff, documentation and equipment. Independent assessors may be members of the operating organization or external to the organization, but they need to be independent of the people who performed, verified and supervised the work being assessed.
Computer security assurance A.49. The trustworthiness of independent or external assessors should be determined before they are permitted access to the information or facility, as the assurance activities are likely to involve sensitive computer security information. Further information on trustworthiness assessments is given in Ref. [6].
Computer security assurance A.50. The procedures for independent assessment should include appropriate restrictions on the removal, use, storage and distribution of sensitive information and should provide for the destruction of such information when it is no longer needed.
Computer security assurance A.51. The capabilities to conduct assurance activities should be developed and maintained to keep pace with changes in technology and the cyber threat. These capabilities are needed by both the staff performing the assurance activities and the competent authority, which might need to review the results of these activities.
Assessment scope A.52. The operator should identify the scope of the assessment in terms of the functional and security domains.
Assessment scope A.53. The scope should be appropriate to the stage of the lifetime of the facility. For example, a complete assessment of computer security might be needed during some stages, whereas in other stages, assessment of specific functional or security domains might be more appropriate. (Reference [8] identifies assessment activities at various points in the I&C system life cycle.)
Assessment evaluation techniques A.54. An assessment team should use the following techniques, as appropriate, to acquire the information the team needs to develop its conclusions and recommendations:
  • Review of documents and records (e.g. legislation, regulations, facility records);

  • Interviews with personnel from the relevant organizations, such as competent authority personnel, operating personnel of the facility and representatives of other organizations;

  • Direct observation of the organization, its practices and systems, and the implementation of computer security measures.

Assessment report development A.55. The data collection component of the assessment consists of recording observations and data of interest from the review of documents and records, interviews with staff, and direct observations. Observations might be individually significant but might also act as a collective indicator of trends at the facility or organization that might need to be addressed. Therefore, the operator should identify observations that support findings indicating trends or recurring issues.
Assessment report development A.56. The observations should be analysed by comparison with requirements such as national regulations, organizational procedures and industry standards, as appropriate. A finding is identified if there is non‑compliance with a regulatory requirement or internal procedure. The basis used for identifying findings needs to be well defined and agreed in the planning stage of the assessment.
Assessment report development A.57. Observations do not always result in findings, and not all findings are negative: they may include identification of good practices, i.e. organizational practices or procedures that provide an effective, typically novel, method for meeting security objectives. Good practices for potential adoption by other organizations to improve their own computer security may be identified and reported.
Assessment report development A.58. In addition to findings and good practices, the assessment team may also provide recommendations and suggestions in the assessment report associated with the findings.
Assessment report development A.59. Recommendations provide guidelines for meeting legal and regulatory requirements or international norms (e.g. convention obligations) when appropriate. Recommendations do not normally include how to correct a problem, but rather only identify that a problem needs to be corrected.
Assessment report development A.60. Suggestions provide an additional level of information regarding a finding, including suggested corrective or mitigatory measures. Such information is not necessarily derived from regulatory guidance, but more typically from industry technical standards and good practice.
Example assessment method A.61. An example assessment method is described in Ref. [23]. The example method provides a cross‑domain assessment of a facility’s functional operations and its computer security. This assists in ensuring coverage of processes and systems that perform facility functions, including operations, safety, security, and emergency preparedness and response.
Example assessment method A.62. Computer security measures that protect SDAs should be managed under a configuration management plan. Such a plan should be developed and implemented by the operator and should include the following measures:
  • Assign relevant roles and responsibilities, and define configuration management processes and procedures.

  • Detail the configuration of the SDAs and their interactions.

  • Identify when in the system development life cycle the SDAs are placed under configuration management.

  • Establish the means for identifying SDAs and a process for managing computer security measures to protect them.

Example assessment method A.63. A current baseline configuration of SDAs should be maintained under configuration control. The baseline configuration should be updated as necessary on the basis of system performance monitoring and, for example, to reflect system hardening or the effects of modifications on computer security.
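One possible way of detecting deviation from the approved baseline configuration is to compare cryptographic digests of configuration files against the recorded baseline, as sketched below (Python). The file based approach and the use of SHA-256 are assumptions; the operator's configuration management tooling may record and compare the baseline differently.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of a file, used as its configuration fingerprint."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def detect_drift(baseline: dict[str, str], monitored_files: list[Path]) -> list[str]:
    """Compare monitored files against an approved baseline mapping path -> digest.

    Returns the paths whose current digest differs from, or is missing in, the
    approved baseline, for review under the change management process.
    """
    findings = []
    for path in monitored_files:
        if baseline.get(str(path)) != file_digest(path):
            findings.append(str(path))
    return findings
```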
Example assessment method A.64. The operator should consider putting in place a systematic process for system hardening of SDAs. System hardening is the application of a combination of administrative and technical control measures designed to make computer system components less vulnerable to cyber‑attack by removing or disabling hardware and software components that are not needed for the operation or maintenance of the system. Hardware and software typically removed or disabled include the following (an illustrative check is sketched after this list):
  • Unused network interfaces or protocols (including disabling of driver software);

  • Unused peripherals (including disabling of driver software);

  • Support for removable media;

  • Unauthorized wired and wireless communications;

  • Messaging services not related to the facility functions performed by the system;

  • Social media services and applications;

  • Servers or clients for unused services;

  • Software compilers in user workstations and servers, except for those used for system development;

  • Software compilers for languages that are not used in the control system;

  • Unused networking and communications protocols;

  • Unused administrative utilities, diagnostics, network management and system management functions;

  • Backups of files, databases and programs used during system development;

  • Unused data and configuration files;

  • Sample programs and scripts;

  • Unused document processing utilities;

  • Unnecessary add‑ins for applications (e.g. browsers);

  • Games.
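As one illustration, hardening of a Linux based asset might include verifying that only the services needed for the asset's facility function remain enabled, as sketched below (Python). The service names, the use of systemd and the reliance on the systemctl command line tool are assumptions made for the sketch.

```python
import subprocess

# Hypothetical allowlist of services needed for the asset's facility function;
# anything else that is enabled is a candidate for removal or disabling.
APPROVED_SERVICES = {"ssh", "historian-agent", "ntp"}

def running_services() -> set[str]:
    """Enabled systemd services on a Linux based asset (illustrative only)."""
    out = subprocess.run(
        ["systemctl", "list-unit-files", "--type=service",
         "--state=enabled", "--no-legend"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {line.split()[0].removesuffix(".service")
            for line in out.splitlines() if line.strip()}

def hardening_findings() -> set[str]:
    """Services that are enabled but not on the approved list."""
    return running_services() - APPROVED_SERVICES
```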

Example assessment method A.65. System hardening should be mandatory for SDAs that use commercial ‘off the shelf’ components, the functionality of which should be reduced to that needed to perform the SDAs’ facility functions (or system functions).
Example assessment method A.66. System hardening should aim to reduce the amount of data that need to be monitored and analysed to determine the security of the protected digital asset or system. System hardening can also help the operator better understand the normal behaviour and functionality of the system.
Example assessment method A.67. System hardening may include the use of technology to ensure that only the approved versions of authorized computer programs are allowed to run on the SDA. The records of system hardening should include documentation of the libraries that the technology has used.
Example assessment method A.68. System hardening should use only secure, trusted update mechanisms. These update mechanisms should be assessed to ensure that they eliminate or minimize the potential for the update to be used as a route to attack the system being updated by, for example, ensuring that system updates are identified by encrypted signatures of authorized vendors.
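A minimal sketch of signature verification for an update package is given below (Python, using the third party cryptography library and a detached RSA signature as assumptions). In practice the operator should use whatever trusted mechanism and key material the vendor actually provides, and should reject any update whose signature cannot be verified.

```python
from pathlib import Path
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def verify_update(package: Path, signature: Path, vendor_public_key_pem: Path) -> bool:
    """Verify a detached RSA signature on an update package before installation.

    Returns True only if the signature matches the package under the vendor's
    published public key; unverified packages should not be installed.
    """
    public_key = serialization.load_pem_public_key(vendor_public_key_pem.read_bytes())
    try:
        public_key.verify(
            signature.read_bytes(),
            package.read_bytes(),
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False
```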
Example assessment method A.69. Vendors issue computer security updates, typically in the form of ‘patches’, to address vulnerabilities identified in their systems. Since modifications to safety systems need to follow resource intensive procedures, the immediate installation of a patch might not be possible, leaving the system at risk for some period of time.
Example assessment method A.70. The operator should obtain from the vendor or develop itself a list of software components used in the systems and the applicable software updates (including security patches).
Example assessment method A.71. The operator should have a formal process in place to ensure that computer security updates to equipment and components are assessed to determine their applicability and effect and, specifically, whether immediate installation is necessary to mitigate the associated vulnerability. The operator should either install the update or provide effective compensatory measures appropriate to protect against exploitation of the vulnerability.
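The assessment process described in para. A.71 might be supported by a simple triage of vendor security updates against the facility's software component list, as sketched below (Python). The component names, the scoring threshold and the use of a numerical severity score from a common scoring system are assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class SecurityUpdate:
    """Hypothetical record of a vendor security update (patch)."""
    component: str        # affected software component
    severity: float       # score from an agreed common scoring system (0-10)
    vulnerability_id: str

# Software components installed on the facility's systems (assumed inventory).
INSTALLED_COMPONENTS = {"plc-firmware 4.2", "hmi-runtime 11.0", "opc-server 3.1"}

def triage(updates: list[SecurityUpdate],
           threshold: float = 7.0) -> dict[str, list[SecurityUpdate]]:
    """Sort updates into those needing prompt action, routine handling or no action."""
    applicable = [u for u in updates if u.component in INSTALLED_COMPONENTS]
    return {
        "install_or_compensate": [u for u in applicable if u.severity >= threshold],
        "routine_assessment": [u for u in applicable if u.severity < threshold],
        "not_applicable": [u for u in updates if u.component not in INSTALLED_COMPONENTS],
    }
```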
Example assessment method A.72. The operator should identify and implement computer security measures that provide robust security to allow for the assessment of updates and associated vulnerabilities without those vulnerabilities being exploited during the period of assessment and installation. For example, system hardening could reduce the number of security updates that need to be assessed and installed, as updates that affect only functionality that has been removed or disabled need not be installed.
Example assessment method A.73. All systems covered by the CSP should be assigned an owner (e.g. a system engineer) who is responsible for monitoring the system.
Example assessment method A.74. System monitoring should include monitoring the status and effectiveness of computer security measures.
Example assessment method A.75. The system owner should be responsible for ensuring that recovery media and configuration information are up to date and that system recovery plans are maintained and can be executed when necessary (e.g. through regular exercise of the recovery plan).
Example assessment method A.76. Configuration changes to an SDA should be controlled with explicit consideration for security consequence analyses. The manager or asset owner should approve any configuration changes to an SDA prior to implementation of the changes. This approval should be formally documented.
Example assessment method A.77. Activities associated with change to the configuration of an SDA should be reviewed by the computer security specialist. Records of changes to the configuration of an SDA should be prepared, retained and reviewed.
Example assessment method A.78. The computer security specialist should have overall responsibility for oversight of configuration change control activities involving SDAs but may delegate this responsibility to asset owners. The computer security specialist should put in place requirements to ensure that effective oversight is performed and coordinated.
Example assessment method A.79. The continual monitoring of the effectiveness of a CSP in practice should include the evaluation of CSP components through exercises.
Example assessment method A.80. Exercises for information and computer security can combine assessment with training. Exercises should also include scenarios involving blended attacks incorporating coordinated cyber‑attack and physical attack.
Example assessment method A.81. The information and computer security management system may be exercised in a graded way for personnel with different roles and at different levels within the organization. Exercises test how effectively work processes and communications function in responding to a computer security incident; they also provide training for all levels of personnel involved in the management and response.
Example assessment method A.82. The operator should consider the benefit of the following:
  • Exercises of security procedures to test the effectiveness of the procedures in meeting the objectives of the CSP;

  • Drills to train personnel in carrying out the security procedures and thereby improve awareness of the procedures, the rationale for the tasks in the procedure and the response to computer security incidents.

Example assessment method A.83. The operator should consider whether to perform intrusive testing (simulating a real cyber‑attack on real systems) as part of the evaluation of a system’s or a digital asset’s computer security, taking account of legal, safety and security considerations and of the operator’s capability to avoid or remediate any adverse effects caused to the digital asset and system. Reference [8] identifies specific restrictions on intrusive testing of I&C systems.
Example assessment method A.84. Since the detailed method of a cyber‑attack will be strongly dependent on the exact configuration of the systems attacked, a system being tested needs to be as similar to the real system as possible. Full backup and restore procedures should be in place to return the system to a known stable state if an assessment test creates abnormal conditions.
Example assessment method A.85. A test plan should specify the schedule and budget for testing and identify the goals of the testing, the expected deliverables, the hardware and software to be used, the resources needed, the rules of engagement and a recovery procedure.
Example assessment method A.86. Testing techniques may include the following:
  • ‘Fingerprinting’, which involves identifying and quantifying all communications within and between components in a system and analysing the effects of these communications on the SDAs to which the tests relate. Fingerprinting a network provides the following:

    1. A network baseline;

    2. An accurate network diagram;

    3. Identification of any rogue devices or malicious data communications;

    4. Verification that boundary protection devices are working as designed;

    5. Identification of opportunities to improve zoning and perimeter protection.

  • ‘Fuzzing’, which aims to find bugs or vulnerabilities in a component or system by injecting a wide variety of data in an automated fashion to identify types of data and injection points that might be used for malicious purposes. This can identify weaknesses in software coding and provide an indication of system hardness (a minimal harness is sketched below).
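A minimal fuzzing harness is sketched below (Python) to illustrate the technique; it mutates seed inputs and records any that cause the function under test to fail. It assumes the target is a parser function exercised in an isolated test environment, never an operational SDA.

```python
import random

def fuzz(parse, seed_inputs: list[bytes], iterations: int = 1000) -> list[bytes]:
    """Feed randomly mutated inputs to the function under test and collect any that crash it."""
    rng = random.Random(0)  # fixed seed so failing cases can be reproduced
    crashers = []
    for _ in range(iterations):
        data = bytearray(rng.choice(seed_inputs))
        if not data:
            continue  # skip empty seeds; nothing to mutate
        for _ in range(rng.randint(1, 8)):  # flip a few bytes at random positions
            data[rng.randrange(len(data))] = rng.randrange(256)
        try:
            parse(bytes(data))
        except Exception:
            crashers.append(bytes(data))
    return crashers

# Example use with a hypothetical parser:
# crashing_inputs = fuzz(my_protocol_parser, [b"\x01\x00HELLO"])
```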

Example assessment method A.87. Computer security indicators can provide a common basis for evaluating vulnerabilities. Well chosen and commonly agreed indicators (e.g. a common scoring system for vulnerabilities) provide a common basis for comparing vulnerabilities across different systems. The operator should assess possible ways in which identified vulnerabilities could be exploited and take measures to prevent such exploitation. The operator should consider reporting all vulnerabilities for inclusion in a national vulnerability database.
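A sketch of a vulnerability registry entry using an agreed common score is given below (Python); ranking by the score allows vulnerabilities on different systems to be compared, and the serialized output illustrates one possible form of report for a national vulnerability database. The field names and the JSON format are assumptions, not a prescribed reporting format.

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class VulnerabilityEntry:
    """Hypothetical registry entry using a common scoring system for comparability."""
    vulnerability_id: str
    affected_system: str
    score: float            # value from the agreed common scoring system (0-10)
    exploitation_path: str  # assessed way in which the vulnerability could be exploited
    mitigation: str         # measure taken to prevent exploitation

def ranked_registry(entries: list[VulnerabilityEntry]) -> list[VulnerabilityEntry]:
    """Order entries by score so vulnerabilities on different systems can be compared."""
    return sorted(entries, key=lambda e: e.score, reverse=True)

def national_report(entries: list[VulnerabilityEntry]) -> str:
    """Serialize entries for reporting to a national vulnerability database (format assumed)."""
    return json.dumps([asdict(e) for e in entries], indent=2)
```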
Example assessment method A.88. Computer security staff should be responsible for reporting any suspected computer security incidents according to the incident response plan. The operator should consider providing specialized awareness training for personnel in key roles not directly related to computer security but that could be affected by failures in computer security.
Example assessment method A.89. The operator should have a contingency plan to detect and respond to computer security incidents that might potentially affect SDAs (and for any other nuclear security events that involve computer security incidents). The plan should provide procedures to identify the location and nature of the threat, prevent or mitigate the consequences of any malicious act, notify relevant competent authorities, and recover from the event.
Example assessment method A.90. Incident response is a collection of activities (see Fig. 10), each of which should be considered.
Example assessment method A.91. Computer security incidents can involve compromise of the confidentiality, integrity and/or availability of the data processed, stored or transmitted by a computer based system. A computer security incident might also involve violation of an explicit or implied computer security policy, an acceptable use policy or a standard computer security practice. Some adverse events (e.g. floods, fires, electrical outages, excessive heat) can cause a system outage but are not the result of malicious acts and therefore are not considered to be computer security incidents.
Example assessment method A.92. A computer security incident might become an information security incident or breach if it involves the actual or suspected compromise of sensitive information. Reference [5] provides examples of potentially sensitive information associated with nuclear facilities.
Example assessment method A.93. The operator should create a local computer security incident response team, which is responsible for responding to computer security incidents within the organization. The size, composition and capabilities of a computer security incident response team will depend on the nature of the organization and its computing infrastructure, but it should include personnel with expertise in nuclear security, nuclear safety, and emergency preparedness and response as well as computer security. The computer security incident response team may have the same membership as, or some members in common with, the computer security team.
Example assessment method A.94. A computer emergency response team is a technical authority that provides assistance and response capabilities when a computer security incident occurs. The computer emergency response team may exist at different levels (e.g. national, local, industrial sector). The computer emergency response team may be available to supplement the internal computer security response capabilities of an operating organization in responding to any computer security incident. The availability of this team to respond during times of crisis should be considered in planning the operating organization’s response activities.
Example assessment method A.95. The operator should ensure that exercises involve the computer security incident response team members as well as any computer emergency response team members who would be involved in a response. Interfaces between the computer emergency response team and the computer security incident response team, including preparatory activities (e.g. preclearance of computer emergency response team members for access to identified areas of the facility), should be considered. Exercises should be designed to test the key communication links between the competent authorities, the computer emergency response team, the computer security incident response team and site operations, as shown in Fig. 11.
Preparation A.96. Planning actions in the preparation phase include establishing a policy that will guide the operational processes for responding to computer security incidents, defining the roles and responsibilities of all parties involved in the incident response, drafting procedures consistent with the policy, and identifying assets available for response. Requirements and criteria for use in responding to computer security incidents need to be clearly defined. The plan of response actions should be approved by senior management.
Detection and analysis A.97. During the detection and analysis phase, the computer security incident response team should be responsible for the technical characterization of the incident. Detection activities include ensuring that there is adequate data monitoring in place to support detection through the collection and preservation of information related to possible incidents. The computer security incident response team may use a dedicated testing and evaluation environment to analyse incidents without affecting operational systems or disturbing potential forensic evidence.
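One way to support the collection and preservation of information related to possible incidents is to record a cryptographic hash of each collected artefact at the time of collection so that its integrity can later be demonstrated. The sketch below is a minimal illustration with assumed file paths; it does not replace a forensically sound evidence handling procedure.

    import hashlib
    import json
    from datetime import datetime, timezone
    from pathlib import Path

    def collect(artefact: Path, manifest: Path) -> None:
        """Append the artefact's SHA-256 digest and collection time to a manifest file."""
        digest = hashlib.sha256(artefact.read_bytes()).hexdigest()
        entry = {
            "file": str(artefact),
            "sha256": digest,
            "collected_at": datetime.now(timezone.utc).isoformat(),
        }
        with manifest.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    # Example usage with assumed paths:
    # collect(Path("/var/log/auth.log"), Path("evidence_manifest.jsonl"))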
Detection and analysis A.98. Analysis activities may extend beyond the computer security incident response team and the initial technical characterization of the incident, and some aspects of the analysis may require extensive resources. Typical priorities for analysis include the following (a structured record capturing these priorities is sketched after the list):
  • Determining the potential effects of the computer security incident on nuclear security, safety and emergency preparedness and response, and identifying actions to place the facility in a safe condition;

  • Identifying the extent of the incident to determine an adequate response;

  • Determining the potential damage from the computer security incident in terms of information loss, physical damage to the facility and public perception;

  • Determining the nature of the computer security incident with regard to the adversary’s immediate intent and possible future threats, including the possibility of a future attack exploiting effects from this incident;

  • Identifying the root cause of the computer security incident and the measures needed to prevent or mitigate the effects of future incidents of a similar nature;

  • Identifying the adversary and developing a profile of the adversary, including the techniques and tools used and the vulnerabilities exploited by the adversary.
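
The analysis priorities listed above can be captured in a structured record so that each incident is characterized consistently. The following dataclass is an illustrative sketch only; the field names are assumptions, not a prescribed reporting format.

    from dataclasses import dataclass, field

    @dataclass
    class IncidentAnalysis:
        incident_id: str
        safety_effects: str = "not yet assessed"         # effects on safety and actions to reach a safe state
        security_effects: str = "not yet assessed"       # effects on nuclear security and emergency response
        extent: list[str] = field(default_factory=list)  # systems and zones known to be affected
        damage_estimate: str = ""                        # information loss, physical damage, public perception
        adversary_intent: str = ""                       # immediate intent and possible follow-on attacks
        root_cause: str = ""                             # cause and measures to prevent recurrence
        adversary_profile: str = ""                      # techniques, tools and vulnerabilities exploited

    analysis = IncidentAnalysis(incident_id="2024-003")
    analysis.extent.append("engineering workstation (level 3 zone)")
    print(analysis)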

Mitigation (containment, eradication and recovery) A.99. Mitigation actions aim to contain a computer security incident; eradicate any malware or correct any mal‑operation or altered configuration from the affected systems; and recover system function and data integrity, using compensatory measures where necessary. Even if the compromised components or systems do not perform a critical safety or security function, they need to be checked and cleared to prevent propagation of the attack to a component or system that does perform such a function. Mitigation activities continue and are adapted as information is collected and analysed during the detection and analysis phase.
Mitigation (containment, eradication and recovery) A.100. When planning how to contain computer security incidents, the operator should recognize that a number of components or systems may be identified during the incident investigation as having been compromised. If any of the compromised components or systems provide a critical safety or security function — such as contributing to the protection of SDAs, the safe operation of the facility or the protection of nuclear or other radioactive material — it will be necessary to implement compensatory measures to perform that function until the component or system can be brought back into operation.
Mitigation (containment, eradication and recovery) A.101. Recovery measures may include like‑for‑like replacement (e.g. a backup firewall); isolation of safety structures, systems and components from the compromised component or system; or temporary measures, such as a guard to control access to the relevant part of the facility to replace a digital access control system. Recovery measures need to replace the function, not necessarily the compromised component or system.
Post‑incident activities A.102. The last phase of response is post‑incident activities to implement measures that will prevent the recurrence of similar types of computer security incident in the future, enable their rapid detection and/or minimize their consequences. This phase may include learning lessons within the organization and sharing intelligence on threats and lessons learned, as appropriate, with the wider computer security incident response community to help prevent a similar attack from succeeding elsewhere. Post‑incident findings may allow the development of new security measures to prevent re‑infection and provide information to update threat and vulnerability profiles. Other post‑incident activities may include evaluating the effectiveness of the CSP and identifying training to address any gaps in the response of personnel, as well as assessing the resources that were needed to address the computer security incident as a guide to planning for future incidents.
Reporting A.103. During the response to a computer security incident there may be situations in which reporting to competent authorities (or other organizations) is required or desirable. Reporting allows everyone who needs to know about a computer security incident to be informed in a timely manner. Since those responding to the incident are likely to be busy, the operator needs to consider carefully the frequency of reporting and the level of detail provided. The operator may consider assigning a specific individual as the point of contact for computer security incident reporting and for requests for information from outside organizations.
A.104. Activity planning should ensure that the computer security requirements for performing and verifying the activities are identified and addressed in the plan.
A.105. Required personnel and contractor qualifications related to computer security should be identified for the activities being performed and taken into account in the planning. Each responsible organization is responsible for reporting suspected computer security incidents according to the incident response plan.
A.106. When developing work instructions, computer security requirements need to be taken into account. These could include instructions for the following:
  • Removal of computer security measures (to allow for maintenance);

  • Provision of alternate or compensatory measures (while normal measures are unavailable);

  • Reapplication of computer security measures (following maintenance);

  • Confirmation that computer security measures have been correctly re‑established (a configuration verification sketch is given after this list).
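
Confirmation that security measures have been correctly re‑established after maintenance can be partly automated by comparing a device's current security relevant settings with the approved baseline recorded in configuration management. In the sketch below, the setting names and values are assumptions for illustration only.

    # Approved security settings for a device, as recorded in configuration management
    # (setting names and values are illustrative assumptions).
    BASELINE = {
        "remote_maintenance_port": "disabled",
        "antivirus_signatures": "current",
        "usb_storage": "blocked",
        "firewall_ruleset_version": "2.7",
    }

    def verify(current: dict[str, str]) -> list[str]:
        """Return the settings that deviate from the approved baseline."""
        deviations = []
        for key, expected in BASELINE.items():
            actual = current.get(key, "<missing>")
            if actual != expected:
                deviations.append(f"{key}: expected {expected!r}, found {actual!r}")
        return deviations

    # Example check after maintenance: the remote maintenance port was left enabled.
    post_maintenance = dict(BASELINE, remote_maintenance_port="enabled")
    for issue in verify(post_maintenance):
        print("DEVIATION:", issue)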

A.107. Maintenance instructions should include instructions for configuring the security settings on devices.
A.108. If maintenance requires the disposition of equipment that is no longer needed, that equipment should be sanitized or securely destroyed.
A.109. Procurement requirements related to computer security should be identified and implemented in the work plan.
A.110. Although computers are used in many aspects of work and personal life, there is a general lack of awareness and knowledge regarding the technology, cyber threats, computer security measures and the possible effects of compromise. Awareness raising and training in computer security are needed for all personnel and contractors in organizations that have nuclear security responsibilities.
A.111. Human error can cause or contribute to computer security incidents. Staff at all levels need sustained awareness of computer security and regular reinforcement of its importance.
A.112. Awareness of the importance of computer security can support it in the following ways:
  • By promoting understanding that computer security supports not only the nuclear security of the facility but also the safety of the facility;

  • By ensuring a common understanding of the key aspects of computer security within the organization;

  • By encouraging observation and coaching of colleagues, reporting of potential computer and information security incidents, and situational awareness;

  • By promoting understanding that cyber‑attacks can affect multiple security and/or safety measures simultaneously, reducing defence in depth;

  • By providing a means by which conflicts between safety and security objectives can be resolved;

  • By recognizing and promoting good practices in computer security;

  • By raising awareness of how humans can inadvertently contribute to computer security incidents.

A.113. The following indicators may be used to evaluate awareness of computer security in an organization:
  • Computer security requirements are clearly documented and well understood by staff.

  • Clear and effective protocols and procedures exist for operating computer systems both inside and outside the organization.

  • Staff members understand and are aware of the importance of the computer security measures set out in the CSP.

  • Computer systems are kept secure and operated in accordance with the computer security baseline and approved procedures.

  • Breaches of computer security procedures are regarded by all as serious and undesirable.

  • Results of observations, evaluations, tests and exercises are positive (e.g. testing indicates that staff do not respond to phishing emails).

  • Managers are fully committed to and supportive of security initiatives, whether related to cyber or to physical systems.

A.114. The aim of a computer security training programme is to ensure that personnel and contractors have the knowledge and capability to perform their work in accordance with the facility computer security requirements and procedures. Computer security training should be incorporated into an existing training management system.
A.115. The operator should have a training programme with the following elements:
  • A computer security training programme, successful completion of which is a precondition for access to computer systems. Individuals’ training should be commensurate with the computer security levels of systems to which they will have access.

  • Specialized training and qualification for individuals with key security responsibilities (e.g. computer security specialist, computer security team, other security officers, project managers, IT administrators, system engineers, designers, technicians, document management personnel, project personnel, procurement personnel, contractors, senior management).

  • Training materials that are updated on a regular basis to include new procedures and measures to address emerging threats.

  • Training that is repeated on a regular basis to ensure that staff are familiar with the latest procedures and threats.

  • A requirement for staff to acknowledge that they understand their computer security responsibilities.

  • Practical evaluations of staff’s understanding of their computer security responsibilities.

A.116. A variety of training approaches should be used, such as e‑learning, classroom training, practical exercises and discussion forums. External organizations, including the IAEA, can provide materials to support such activities.
A.117. The training programme should include (a) indicators for evaluating computer security awareness and the effectiveness of training and (b) processes for continual improvement and periodic refresher and update training for staff, as needed.
A.118. An example process for planning response to computer security incidents can be found in Ref. [25].
[1] INTERNATIONAL ATOMIC ENERGY AGENCY, Objective and Essential Elements of a State’s Nuclear Security Regime, IAEA Nuclear Security Series No. 20, IAEA, Vienna (2013).

[2] INTERNATIONAL ATOMIC ENERGY AGENCY, Nuclear Security Recommendations on Physical Protection of Nuclear Material and Nuclear Facilities (INFCIRC/225/Revision 5), IAEA Nuclear Security Series No. 13, IAEA, Vienna (2011).

[3] INTERNATIONAL ATOMIC ENERGY AGENCY, Nuclear Security Recommendations on Radioactive Material and Associated Facilities, IAEA Nuclear Security Series No. 14, IAEA, Vienna (2011).

[4] EUROPEAN POLICE OFFICE, INTERNATIONAL ATOMIC ENERGY AGENCY, INTERNATIONAL CIVIL AVIATION ORGANIZATION, INTERNATIONAL CRIMINAL POLICE ORGANIZATION–INTERPOL, UNITED NATIONS INTERREGIONAL CRIME AND JUSTICE RESEARCH INSTITUTE, UNITED NATIONS OFFICE ON DRUGS AND CRIME, WORLD CUSTOMS ORGANIZATION, Nuclear Security Recommendations on Nuclear and Other Radioactive Material out of Regulatory Control, IAEA Nuclear Security Series No. 15, IAEA, Vienna (2011).

[5] INTERNATIONAL ATOMIC ENERGY AGENCY, Security of Nuclear Information, IAEA Nuclear Security Series No. 23‑G, IAEA, Vienna (2015).

[6] INTERNATIONAL ATOMIC ENERGY AGENCY, Preventive and Protective Measures against Insider Threats, IAEA Nuclear Security Series No. 8‑G (Rev. 1), IAEA, Vienna (2020).

[7] INTERNATIONAL ATOMIC ENERGY AGENCY, Computer Security for Nuclear Security, IAEA Nuclear Security Series No. 42‑G, IAEA, Vienna (2021).

[8] INTERNATIONAL ATOMIC ENERGY AGENCY, Computer Security of Instrumentation and Control Systems at Nuclear Facilities, IAEA Nuclear Security Series No. 33‑T, IAEA, Vienna (2018).

[9] INTERNATIONAL ATOMIC ENERGY AGENCY, National Nuclear Security Threat Assessment, Design Basis Threats and Representative Threat Statements, IAEA Nuclear Security Series No. 10‑G (Rev. 1), IAEA, Vienna (2021).

[10] INTERNATIONAL ATOMIC ENERGY AGENCY, Security during the Lifetime of a Nuclear Facility, IAEA Nuclear Security Series No. 35‑G, IAEA, Vienna (2019).

[11] INTERNATIONAL ORGANIZATION FOR STANDARDIZATION, INTERNATIONAL ELECTROTECHNICAL COMMISSION, Information Technology — Security Techniques — Information Security Management Systems — Overview and Vocabulary, ISO/IEC 27000:2018, ISO, Geneva (2018).

[12] INTERNATIONAL ATOMIC ENERGY AGENCY, IAEA Safety Glossary, Terminology Used in Nuclear Safety and Radiation Protection, 2018 Edition, IAEA, Vienna (2019).

[13] INTERNATIONAL ATOMIC ENERGY AGENCY, Safety of Nuclear Power Plants: Design, IAEA Safety Standards Series No. SSR‑2/1 (Rev. 1), IAEA, Vienna (2016).

[14] INTERNATIONAL ORGANIZATION FOR STANDARDIZATION, INTERNATIONAL ELECTROTECHNICAL COMMISSION, Information Technology — Security Techniques — Information Security Risk Management, ISO/IEC 27005:2018, ISO, Geneva (2018).

[15] INTERNATIONAL ATOMIC ENERGY AGENCY, Physical Protection of Nuclear Material and Nuclear Facilities (Implementation of INFCIRC/225/Revision 5), IAEA Nuclear Security Series No. 27‑G, IAEA, Vienna (2018).

[16] INTERNATIONAL ATOMIC ENERGY AGENCY, Identification of Vital Areas at Nuclear Facilities, IAEA Nuclear Security Series No. 16, IAEA, Vienna (2013).

[17] INTERNATIONAL ATOMIC ENERGY AGENCY, Safety Classification of Structures, Systems and Components in Nuclear Power Plants, IAEA Safety Standards Series No. SSG‑30, IAEA, Vienna (2014).

[18] INTERNATIONAL ORGANIZATION FOR STANDARDIZATION, INTERNATIONAL ELECTROTECHNICAL COMMISSION, Information Technology — Security Techniques — Evaluation Criteria for IT Security, ISO/IEC 15408:2009, ISO, Geneva (2009).

[19] INTERNATIONAL ORGANIZATION FOR STANDARDIZATION, INTERNATIONAL ELECTROTECHNICAL COMMISSION, Information Technology — Security Techniques — Information Security Management Systems — Requirements, ISO/IEC 27001:2013, ISO, Geneva (2013).

[20] INTERNATIONAL ELECTROTECHNICAL COMMISSION, Nuclear Power Plants — Instrumentation and Control Systems — Requirements for Security Programmes for Computer‑Based Systems, IEC 62645:2014, IEC, Geneva (2014).

[21] INTERNATIONAL ORGANIZATION FOR STANDARDIZATION, INTERNATIONAL ELECTROTECHNICAL COMMISSION, Information Technology — Security Techniques — Code of Practice for Information Security Controls, ISO/IEC 27002:2013, ISO, Geneva (2013).

[22] INTERNATIONAL ATOMIC ENERGY AGENCY, The Management System for Nuclear Installations, IAEA Safety Standards Series No. GS‑G‑3.5, IAEA, Vienna (2009).

[23] INTERNATIONAL ATOMIC ENERGY AGENCY, Conducting Computer Security Assessments at Nuclear Facilities, IAEA, Vienna (2016).

[24] NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, Computer Security Incident Handling Guide, NIST SP 800‑61, Rev. 2, NIST, Gaithersburg (2012).

[25] INTERNATIONAL ATOMIC ENERGY AGENCY, Computer Security Incident Response Planning at Nuclear Facilities, IAEA, Vienna (2016).

I–1. This annex provides some examples of ways in which adversaries could exploit vulnerabilities in systems performing critical facility functions. However, these are only examples, and operators need to think creatively about computer security to imagine how adversaries might act and how computer security measures might counter their actions.
I–2. The examples are derived from discussions with experts from Member States. They are not intended to provide an exhaustive list of possibilities or a recipe for attacking nuclear facilities, but rather a starting point for facility operators and Member States to develop plans to address the dynamic, rapidly changing cyber threat environment.
I–3. A coordinated cyber‑attack might consist of several phases:
  • Identifying a target or targets;

  • Performing reconnaissance;

  • Obtaining access to or otherwise compromising relevant systems;

  • Carrying out the attack;

  • Concealing evidence about the attack and the adversary.

I–4. Adversaries will use some or all of these tactics, and they need to be considered when developing cyber threat profiles specific to nuclear facility instrumentation and control (I&C) systems and other sensitive digital assets (SDAs). The example scenarios presented in this annex include the use of these tactics and illustrate common types of attack suggested by computer security experts with experience of the nuclear industry.
I–5. Types of threat are described in Ref. [I–1].
I–6. Goal of the attack: To gain access to nuclear information and digital assets by exploiting a trusted path used by vendors to provide support.
I–7. Description: The attack is initially directed at the Internet based remote access portal through which vendors have access to sensitive information and facility SDAs to provide support. The adversary compromises the portal and, via privilege escalation, gains administrative control over the database and changes the email address associated with a specific vendor. This vendor has remote access to critical operational information about the facility and some of the SDAs. The adversary uses the ‘forgotten password’ function on the portal, which sends a password refresh link to the email address introduced by the adversary. The adversary uses this link to change the vendor’s password and logs in to the portal with the identity of the authorized vendor. Once logged in, the adversary has access to all the information on the portal and all the SDAs to which the vendor has access. The adversary then begins to modify the settings and operational parameters of SDAs, leading to operational instability and ultimately to the shutdown of the facility.
I–8. Goal of the attack: To gain access to internal SDAs and systems.
I–9. Description:
  • Using open source tools and search engines, the adversary locates the perimeter network server used to report production information related to nuclear isotopes from trusted internal systems to the Internet. This server resides on the perimeter network but is populated by a master database server on the same network as the control system for a facility that produces nuclear isotopes. The master database server collects information from the internal manufacturing production environment and sends this information to the database located on the perimeter network. The perimeter network is separated from the production network by a firewall, which is configured with an access control list to ensure that only the database on the perimeter network server can communicate to the master database.

  • The adversary exploits a vulnerability to obtain administrative access to the server on the perimeter network and takes control of the communication channel between that server and the master database server on the control system network. The firewall is configured to allow communications between the perimeter network and the master database (i.e. it establishes ‘transitive trust’ between the networks), so the adversary, who has control of the server on the perimeter network, can connect directly to the master database on the control system network. (A simple audit for firewall rules of this kind is sketched after this list.)

  • The adversary uses the connection to the master database to perform reconnaissance and enumeration of the control system assets that are on the same network. Since there are no security measures on the control system network, the adversary is able to take control of SDAs and compromise the technology controlling isotope development, management, transport, storage and inventory.
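
The ‘transitive trust’ weakness in this scenario arises from firewall rules that, taken together, allow a compromised perimeter network host to reach the control system network. The following sketch, with hypothetical network and rule definitions, illustrates a simple audit that flags any allow rule from the perimeter network into the control network.

    from ipaddress import ip_network

    # Hypothetical network assignments for illustration only.
    PERIMETER_NET = ip_network("10.1.0.0/24")  # perimeter (DMZ) network
    CONTROL_NET = ip_network("10.2.0.0/24")    # control system network

    # Simplified representation of firewall rules (source, destination, port, action).
    rules = [
        {"src": "10.1.0.5/32", "dst": "10.2.0.10/32", "port": 1433, "action": "allow"},  # database replication
        {"src": "10.1.0.0/24", "dst": "10.1.0.0/24", "port": 443, "action": "allow"},
    ]

    def risky(rule: dict) -> bool:
        """Flag allow rules whose source is on the perimeter network and destination on the control network."""
        return (
            rule["action"] == "allow"
            and ip_network(rule["src"]).subnet_of(PERIMETER_NET)
            and ip_network(rule["dst"]).subnet_of(CONTROL_NET)
        )

    for rule in rules:
        if risky(rule):
            print("Review: rule allows perimeter-to-control traffic:", rule)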

I–10. Goal of the attack: To force the shutdown of a nuclear power plant.
I–11. Description:
  • An engineer at a nuclear power plant works at home on a laptop computer that is used to support plant engineering and optimization, update performance programmes and ‘tune’ software for safety monitoring.

  • While at home, the engineer uses the computer to access a vendor’s web site and obtain a software update for the plant I&C systems that are instrumental in supporting plant operations. While the update is downloading, the engineer uses an on‑line bank, visits the corporate web site and uses social media, during which malicious software is downloaded to the computer. This malware is new and is not detected by the antivirus software on the computer.

  • Since corporate policy prohibits taking the computer into the plant, the engineer copies the downloaded control system update to a USB storage device, intending to use this to apply the software updates to the I&C assets. However, the malware has also copied itself to the USB device, and when the engineer uses it to install the update through an engineering workstation in the plant, the malware copies itself onto the plant system. The plant operator has assumed that the physical protection measures in place will prevent an unauthorized computer from connecting to the plant control system network, and the possibility of infection via removable media has not been considered.

  • After the malware infects the engineering workstation, it replicates and moves to other networked components within the plant. Since the operator has not deployed computer security measures at the plant level and there is no antivirus software on critical plant systems, the malware infects critical digital assets on the network, causing failures and forcing the plant to shut down.

I–12. Goal of the attack: To obtain enough information to plan an accurate attack on plant operations.
I–13. Description:
  • An adversary collects information from social media and observation indicating that a nuclear facility will be procuring a control system in the form of a system upgrade. In addition, the facility operator intends to sell old operational equipment to help pay for the new control system.

  • Since the facility has no formal decommissioning procedure related to information security, a system that was used to run critical I&C operations is sold without reviewing or removing information stored in it. The adversary buys the system and discovers up to date project files, network diagrams, username and password information, and other data that provide a comprehensive understanding of the nuclear facility operations.

  • The adversary uses this information to develop a plan to attack specific SDAs used at the facility and to create convincing emails for use in a phishing campaign. Ultimately, the adversary uses both the information obtained from the purchased system and that unwittingly provided by victims of the phishing campaign to launch a blended attack on the facility.

I–14. Goal of the attack: To obtain, through social engineering, information from a facility security officer that can be used to further an attack.
I–15. Description:
  • An adversary conducts a focused social engineering campaign against a facility security officer using phishing, physical reconnaissance and publicly available information, including that from the officer’s social media presence.

  • The adversary, with a false identity, uses this information to begin communicating directly with the security officer, who gradually comes to trust the adversary, believing that it is someone else. As the correspondence continues, the adversary starts to add credible email attachments that are actually malicious software that, when activated, covertly opens a communication path back to the adversary’s computer and sends specific files from the security officer’s computer to the adversary. With this information, the adversary is able to create accurate and detailed plans to attack the plant’s physical protection systems and intercept nuclear material in transit.

[I–1] INTERNATIONAL ATOMIC ENERGY AGENCY, Computer Security for Nuclear Security, IAEA Nuclear Security Series No. 42‑G, IAEA, Vienna (2021).
II–1. The assignment of computer security levels to systems (or zones containing systems) is based on the potential consequences of an attack on each system for the safety, security and operation of the facility: the less tolerable the consequences, the more stringent the computer security level.
II–2. To avoid case by case analyses of every system and potential consequence, criteria can be established to facilitate the assignment of the computer security levels.
II–3. One fundamental consideration is the safety classification of the system. However, there is not an automatic connection between computer security levels and safety classes. A stringent computer security level is needed for a system important to safety, but a stringent level may also be needed for systems with no safety classification if they have a critical role in preventing severe potential consequences for security.
II–4. An example graded approach to computer security levels uses the following high level criteria (a simple decision function expressing these criteria is sketched after this list):
  • Computer security level 1 is assigned to plant digital systems for which compromise of their integrity or availability could lead to radiological consequences to the population off the site. This corresponds to the criterion for 1E/F1A safety classified systems (corresponding to systems supporting category A functions in the International Electrotechnical Commission safety scheme [II–1]).

  • Computer security level 2 is assigned to plant digital systems for which compromise of their integrity or availability could degrade one or more of the following:

    (i) The management of an emergency;

    (ii) Plant safety in normal operation;

    (iii) The main nuclear process operation;

    (iv) The physical protection of the plant.

  • Computer security level 3 is assigned to plant digital systems for which compromise of their integrity or availability has no radiological consequences, nor an adverse effect on safety or physical protection, but could have other major effects. Such systems might include, in particular, digital assets assisting plant operation or maintenance, or systems that could have an effect on power generation.

  • Computer security level 4 is assigned to plant digital systems for which compromise of their integrity or availability has no short term effect on plant performance but can have such an effect in the longer term.

  • Computer security level 5 is assigned to plant digital systems for which compromise of their integrity or availability has no effect on safety, on plant availability or on the performance of the facility.
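
The criteria above can be expressed as a simple decision function. The sketch below illustrates the example graded approach only; the consequence flags are assumptions, and a real assignment would follow the operator's documented consequence analysis rather than an automated rule.

    def assign_level(off_site_radiological: bool,
                     degrades_safety_security_or_process: bool,
                     other_major_effect: bool,
                     long_term_effect_only: bool) -> int:
        """Return the computer security level (1 = most stringent) under the example graded approach."""
        if off_site_radiological:
            return 1  # compromise could lead to off-site radiological consequences
        if degrades_safety_security_or_process:
            return 2  # could degrade emergency management, safety, the main process or physical protection
        if other_major_effect:
            return 3  # no safety or physical protection effect, but other major effects (e.g. power generation)
        if long_term_effect_only:
            return 4  # no short term effect on plant performance, possible longer term effect
        return 5      # no effect on safety, availability or performance

    # Example: a digital asset assisting plant maintenance with no safety role.
    print(assign_level(False, False, True, False))  # -> level 3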

II–5. In addition to these high level criteria, the definition of the computer security levels can include a list of typical facility functions or types of system that are specific to each level. This list could simplify the assignment of computer security levels to systems.
II–6. The computer security level classification focuses on potential consequences related to compromise of computer based systems (see Ref. [II–2]). In many cases, information acquired or calculated by a digital system can also be obtained with analogue tools or by a person, in which case the computer security level can be less stringent (and therefore less restrictive for normal operations).
II–7. When several diverse digital assets are used for the same function, a primary system supporting the function needs to be chosen and assigned to a computer security level according to the consequences of its compromise.
[II–1] INTERNATIONAL ELECTROTECHNICAL COMMISSION, Nuclear Power Plants — Instrumentation and Control Important to Safety — General Requirements for Systems, IEC 61513:2011, IEC, Geneva (2011).
III–1. This annex provides an example of the application of computer security levels and zones. Table III–1 provides a list of systems used in this example and shows the mapping of computer security levels to the physical and logical zones used in this example.
III–2. For simple systems, consisting of a small number of assets in well defined physical locations, application of the computer security levels and physical and logical zones is straightforward. It is more complicated for complex systems that extend throughout the facility or for physical areas that contain systems that need to be assigned to multiple security levels, such as the main control room.
III–3. Typically, the main control room contains controls for many different categories of systems that have differing security requirements (e.g. safety systems, steam supply (boiler), electrical systems, auxiliary systems, IT systems). The human–machine interfaces for all facility systems are entirely or partly in the main control room. These systems and human–machine interfaces typically use digital assets to perform their functions.
III–4. In old facilities, this creates difficulties in the application of computer security for several reasons:
  • Older human–machine interface consoles typically include controls for multiple systems, especially for balance of plant and auxiliary systems. This aggregation can increase the difficulty of providing isolation and separation of these systems. In some cases, facility functions performed by systems assigned to different computer security levels may be combined into one human–machine interface console, to which the most stringent security level needs to be applied.

  • The digital assets located within the physical area of the main control room and its equipment rooms would, using the approach of computer security levels and zones, be assigned different computer security levels. For example, a reactor protection system may be assigned the most stringent level (e.g. security level 1), while a personal computer providing the operator with access to email may be assigned the least stringent level (e.g. security level 5).

  • Personnel performing authorized activities on a system within the main control room may have access to other equipment within the main control room.

III–5. The following illustrative example is provided to explain potential computer security solutions for the issues described above, in terms of the concepts detailed in Fig. 1 of the main text.
III–6. Application of computer security zones to the main control room (along with the physical protection and fire protection systems) is difficult because of the need for centralized monitoring and management of facility functions. The computer security zone concept allows for physical and/or logical boundaries, which can help to address these limitations. The relationship is illustrated in Fig. III–1.
III–7. The main control room (and the rooms within the protected area containing electronic equipment) is assumed to be classified and protected as a vital area. This implies that sabotage of equipment within the main control room could ultimately result in unacceptable radiological consequences.
III–8. Table III–1 provides an example of a subset of systems that need monitoring, communications or operation from within the main control room.
Reactor protection system (computer security level 1) III–9. In Table III–1, the most stringent computer security level (level 1) has a requirement that both the logical and physical computer security zone boundaries be specified strictly and that these boundaries do not extend past each other. For example, the dedicated network can be constrained to locations within the vital area (or equivalent).
Reactor protection system (computer security level 1) III–10. Physical and logical access to zones assigned computer security level 1 needs to be strictly controlled. Physical access can be controlled using a robust barrier with access control and intrusion detection to meet the requirements recommended in Ref. [III–1], and logical access can be controlled through a fail‑secure, unidirectional data communication pathway (e.g. a data diode) in accordance with the guidance in this publication and Ref. [III–2].
Reactor protection system (computer security level 1) III–11. Typically, systems that perform the facility function of preventing accident conditions (e.g. those on a reactor protection system) will be assigned to the most stringent computer security level. The equipment providing the function will be located in a vital area close to the reactor, but the equipment will be monitored through a human–machine interface in the main control room. This creates a potential problem with applying computer security zones, since the interconnection between the reactor protection system and the human–machine interface might be routed outside the vital areas (e.g. in the protected area), which would violate the physical security requirement.
Reactor protection system (computer security level 1) III–12. One solution would be to separate the monitoring function from the function of preventing accident conditions. This would allow for logical separation by means of a data diode between the digital assets in the vital area preventing accident conditions and those outside the vital area used for monitoring in the main control room. This solution would only be effective if the function of preventing accident conditions was independent and did not need any action or information from outside the systems assigned to perform the function.
Reactor protection system (computer security level 1) III–13. The digital assets credited with preventing accident conditions will be assigned to the most stringent computer security level (level 1) on the basis of facility function. These digital assets will be located in a vital area outside the main control room. The digital assets credited with monitoring the reactor protection system (e.g. the reactor protection system human–machine interface console in the main control room) will be assigned security level 2 (or higher).
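The separation described in this example can be expressed as a simple rule for proposed interconnections: no data may flow towards the more stringent zone, and data leaving the level 1 zone must use a unidirectional pathway. The sketch below encodes this rule for the zones in this example; the rule is a simplification for illustration and assumes, as in this example, that the protection function needs no information from outside.

    # Computer security levels of the zones in this example (1 = most stringent).
    ZONE_LEVEL = {
        "reactor_protection": 1,  # digital assets preventing accident conditions (vital area)
        "mcr_monitoring": 2,      # monitoring console in the main control room
        "office_it": 5,
    }

    def flow_allowed(src_zone: str, dst_zone: str, unidirectional: bool) -> bool:
        """Permit flows only towards zones of equal or less stringent level, and require a
        unidirectional pathway (e.g. a data diode) for data leaving the level 1 zone."""
        src, dst = ZONE_LEVEL[src_zone], ZONE_LEVEL[dst_zone]
        if dst < src:
            return False  # no data may flow towards a more stringent zone in this example
        if src == 1 and not unidirectional:
            return False  # data leaving level 1 must pass through a unidirectional pathway
        return True

    print(flow_allowed("reactor_protection", "mcr_monitoring", unidirectional=True))  # True
    print(flow_allowed("mcr_monitoring", "reactor_protection", unidirectional=True))  # False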